[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg. 11124 1726882359.04925: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-Xyq executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 11124 1726882359.05544: Added group all to inventory 11124 1726882359.05546: Added group ungrouped to inventory 11124 1726882359.05550: Group all now contains ungrouped 11124 1726882359.05554: Examining possible inventory source: /tmp/network-91m/inventory.yml 11124 1726882359.33899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 11124 1726882359.34083: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 11124 1726882359.34108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 11124 1726882359.34305: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 11124 1726882359.34499: Loaded config def from plugin (inventory/script) 11124 1726882359.34502: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 11124 1726882359.34544: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 11124 1726882359.35069: Loaded config def from plugin 
(inventory/yaml) 11124 1726882359.35071: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 11124 1726882359.35240: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 11124 1726882359.35876: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 11124 1726882359.35879: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 11124 1726882359.35882: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 11124 1726882359.35887: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 11124 1726882359.35891: Loading data from /tmp/network-91m/inventory.yml 11124 1726882359.35953: /tmp/network-91m/inventory.yml was not parsable by auto 11124 1726882359.36012: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 11124 1726882359.36053: Loading data from /tmp/network-91m/inventory.yml 11124 1726882359.36124: group all already in inventory 11124 1726882359.36131: set inventory_file for managed_node1 11124 1726882359.36135: set inventory_dir for managed_node1 11124 1726882359.36136: Added host managed_node1 to inventory 11124 1726882359.36138: Added host managed_node1 to group all 11124 1726882359.36139: set ansible_host for managed_node1 11124 1726882359.36139: set ansible_ssh_extra_args for managed_node1 11124 1726882359.36147: set inventory_file for managed_node2 11124 1726882359.36150: set inventory_dir for managed_node2 11124 1726882359.36151: Added host managed_node2 to inventory 11124 1726882359.36152: Added host managed_node2 to group all 11124 1726882359.36153: set ansible_host for managed_node2 11124 1726882359.36154: set ansible_ssh_extra_args for managed_node2 11124 
1726882359.36156: set inventory_file for managed_node3 11124 1726882359.36158: set inventory_dir for managed_node3 11124 1726882359.36159: Added host managed_node3 to inventory 11124 1726882359.36160: Added host managed_node3 to group all 11124 1726882359.36160: set ansible_host for managed_node3 11124 1726882359.36161: set ansible_ssh_extra_args for managed_node3 11124 1726882359.36165: Reconcile groups and hosts in inventory. 11124 1726882359.36168: Group ungrouped now contains managed_node1 11124 1726882359.36170: Group ungrouped now contains managed_node2 11124 1726882359.36171: Group ungrouped now contains managed_node3 11124 1726882359.36243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11124 1726882359.36384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11124 1726882359.36430: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11124 1726882359.36457: Loaded config def from plugin (vars/host_group_vars) 11124 1726882359.36459: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11124 1726882359.36471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11124 1726882359.36479: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11124 1726882359.36517: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11124 1726882359.36841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882359.36937: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11124 1726882359.36980: Loaded config def from plugin (connection/local) 11124 1726882359.36983: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11124 1726882359.37739: Loaded config def from plugin (connection/paramiko_ssh) 11124 1726882359.37743: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11124 1726882359.38995: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11124 1726882359.39198: Loaded config def from plugin (connection/psrp) 11124 1726882359.39202: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11124 1726882359.40118: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11124 1726882359.40158: Loaded config def from plugin (connection/ssh) 11124 1726882359.40161: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11124 1726882359.42815: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11124 1726882359.42865: Loaded config def from plugin (connection/winrm) 11124 1726882359.42868: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11124 1726882359.42909: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11124 1726882359.43230: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11124 1726882359.43307: Loaded config def from plugin (shell/cmd) 11124 1726882359.43309: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11124 1726882359.43338: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11124 1726882359.43415: Loaded config def from plugin (shell/powershell) 11124 1726882359.43418: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11124 1726882359.43580: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11124 1726882359.44490: Loaded config def from plugin (shell/sh) 11124 1726882359.44492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11124 1726882359.44531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11124 1726882359.44787: Loaded config def from plugin (become/runas) 11124 1726882359.44789: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11124 1726882359.44989: Loaded config def from plugin (become/su) 11124 1726882359.44991: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11124 1726882359.45153: Loaded config def from plugin (become/sudo) 11124 
1726882359.45155: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11124 1726882359.45190: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 11124 1726882359.45961: in VariableManager get_vars() 11124 1726882359.46385: done with get_vars() 11124 1726882359.46525: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11124 1726882359.55048: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11124 1726882359.55440: in VariableManager get_vars() 11124 1726882359.55446: done with get_vars() 11124 1726882359.55451: variable 'playbook_dir' from source: magic vars 11124 1726882359.55452: variable 'ansible_playbook_python' from source: magic vars 11124 1726882359.55453: variable 'ansible_config_file' from source: magic vars 11124 1726882359.55454: variable 'groups' from source: magic vars 11124 1726882359.55454: variable 'omit' from source: magic vars 11124 1726882359.55455: variable 'ansible_version' from source: magic vars 11124 1726882359.55456: variable 'ansible_check_mode' from source: magic vars 11124 1726882359.55457: variable 'ansible_diff_mode' from source: magic vars 11124 1726882359.55458: variable 'ansible_forks' from source: magic vars 11124 1726882359.55458: variable 'ansible_inventory_sources' from source: magic vars 11124 1726882359.55459: variable 'ansible_skip_tags' from source: magic vars 11124 1726882359.55460: variable 'ansible_limit' from source: magic vars 11124 1726882359.55461: variable 'ansible_run_tags' from source: magic vars 11124 1726882359.55461: variable 'ansible_verbosity' from source: magic vars 11124 1726882359.55802: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml 11124 1726882359.57331: in 
VariableManager get_vars() 11124 1726882359.57349: done with get_vars() 11124 1726882359.57359: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11124 1726882359.59302: in VariableManager get_vars() 11124 1726882359.59318: done with get_vars() 11124 1726882359.59328: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11124 1726882359.59472: in VariableManager get_vars() 11124 1726882359.59502: done with get_vars() 11124 1726882359.59655: in VariableManager get_vars() 11124 1726882359.59673: done with get_vars() 11124 1726882359.59682: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11124 1726882359.59755: in VariableManager get_vars() 11124 1726882359.59772: done with get_vars() 11124 1726882359.60076: in VariableManager get_vars() 11124 1726882359.60090: done with get_vars() 11124 1726882359.60095: variable 'omit' from source: magic vars 11124 1726882359.60112: variable 'omit' from source: magic vars 11124 1726882359.60155: in VariableManager get_vars() 11124 1726882359.60168: done with get_vars() 11124 1726882359.60223: in VariableManager get_vars() 11124 1726882359.60234: done with get_vars() 11124 1726882359.60283: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11124 
1726882359.60517: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11124 1726882359.60651: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11124 1726882359.61689: in VariableManager get_vars() 11124 1726882359.61709: done with get_vars() 11124 1726882359.62238: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 11124 1726882359.62411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11124 1726882359.64324: in VariableManager get_vars() 11124 1726882359.64343: done with get_vars() 11124 1726882359.64356: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11124 1726882359.64445: in VariableManager get_vars() 11124 1726882359.64468: done with get_vars() 11124 1726882359.64588: in VariableManager get_vars() 11124 1726882359.64604: done with get_vars() 11124 1726882359.65881: in VariableManager get_vars() 11124 1726882359.65898: done with get_vars() 11124 1726882359.65903: variable 'omit' from source: magic vars 11124 1726882359.65927: variable 'omit' from source: magic vars 11124 1726882359.65965: in VariableManager get_vars() 11124 1726882359.65979: done with get_vars() 11124 1726882359.66000: in VariableManager get_vars() 11124 1726882359.66015: done with get_vars() 11124 1726882359.66049: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11124 1726882359.66181: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11124 1726882359.66262: Loading data from 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11124 1726882359.66955: in VariableManager get_vars() 11124 1726882359.66977: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11124 1726882359.70755: in VariableManager get_vars() 11124 1726882359.70781: done with get_vars() 11124 1726882359.70799: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11124 1726882359.73315: in VariableManager get_vars() 11124 1726882359.73336: done with get_vars() 11124 1726882359.73396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11124 1726882359.73419: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11124 1726882359.73671: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11124 1726882359.73843: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11124 1726882359.73846: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 11124 1726882359.73880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11124 1726882359.73911: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11124 1726882359.74102: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11124 1726882359.74162: Loaded config def from plugin (callback/default) 11124 1726882359.74169: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11124 1726882359.75461: Loaded config def from plugin (callback/junit) 11124 1726882359.75466: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11124 1726882359.75517: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11124 1726882359.75588: Loaded config def from plugin (callback/minimal) 11124 1726882359.75591: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11124 1726882359.75633: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11124 1726882359.75699: Loaded config def from plugin (callback/tree) 11124 1726882359.75701: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11124 1726882359.75829: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11124 1726882359.75831: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Xyq/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_bond_deprecated_nm.yml ***************************************** 2 plays in /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 11124 1726882359.75857: in VariableManager get_vars() 11124 1726882359.75873: done with get_vars() 11124 1726882359.75879: in VariableManager get_vars() 11124 1726882359.75888: done with get_vars() 11124 1726882359.75892: variable 'omit' from source: magic vars 11124 1726882359.75937: in VariableManager get_vars() 11124 1726882359.75951: done with get_vars() 11124 1726882359.75974: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] *** 11124 1726882359.77295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 11124 1726882359.78302: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 11124 1726882359.79283: getting the remaining hosts for this loop 11124 1726882359.79285: done getting the remaining hosts for this loop 11124 1726882359.79287: getting the next task for host managed_node1 11124 1726882359.79291: done getting next task for host managed_node1 11124 1726882359.79293: ^ task is: TASK: Gathering Facts 11124 1726882359.79295: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882359.79297: getting variables 11124 1726882359.79298: in VariableManager get_vars() 11124 1726882359.79308: Calling all_inventory to load vars for managed_node1 11124 1726882359.79310: Calling groups_inventory to load vars for managed_node1 11124 1726882359.79312: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882359.79323: Calling all_plugins_play to load vars for managed_node1 11124 1726882359.79332: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882359.79335: Calling groups_plugins_play to load vars for managed_node1 11124 1726882359.79373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882359.79428: done with get_vars() 11124 1726882359.79435: done getting variables 11124 1726882359.79505: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6 Friday 20 September 2024 21:32:39 -0400 (0:00:00.037) 0:00:00.037 ****** 11124 1726882359.79527: entering _queue_task() for managed_node1/gather_facts 11124 1726882359.79529: Creating lock for gather_facts 11124 1726882359.79857: worker is 1 (out of 1 available) 11124 1726882359.79869: exiting _queue_task() for managed_node1/gather_facts 11124 1726882359.79891: done queuing things up, now waiting for results queue to drain 11124 1726882359.79893: waiting for pending results... 
11124 1726882359.80905: running TaskExecutor() for managed_node1/TASK: Gathering Facts 11124 1726882359.81067: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000cd 11124 1726882359.81082: variable 'ansible_search_path' from source: unknown 11124 1726882359.81178: calling self._execute() 11124 1726882359.82054: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882359.82058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882359.82067: variable 'omit' from source: magic vars 11124 1726882359.82392: variable 'omit' from source: magic vars 11124 1726882359.82421: variable 'omit' from source: magic vars 11124 1726882359.82577: variable 'omit' from source: magic vars 11124 1726882359.82621: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882359.82973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882359.82999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882359.83017: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882359.83028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882359.83058: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882359.83061: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882359.83067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882359.83180: Set connection var ansible_shell_executable to /bin/sh 11124 1726882359.83188: Set connection var ansible_shell_type to sh 11124 1726882359.83205: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882359.83212: Set connection var ansible_timeout 
to 10 11124 1726882359.83217: Set connection var ansible_pipelining to False 11124 1726882359.83220: Set connection var ansible_connection to ssh 11124 1726882359.83244: variable 'ansible_shell_executable' from source: unknown 11124 1726882359.83250: variable 'ansible_connection' from source: unknown 11124 1726882359.83253: variable 'ansible_module_compression' from source: unknown 11124 1726882359.83255: variable 'ansible_shell_type' from source: unknown 11124 1726882359.83258: variable 'ansible_shell_executable' from source: unknown 11124 1726882359.83261: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882359.83263: variable 'ansible_pipelining' from source: unknown 11124 1726882359.83268: variable 'ansible_timeout' from source: unknown 11124 1726882359.83270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882359.83674: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882359.83683: variable 'omit' from source: magic vars 11124 1726882359.83688: starting attempt loop 11124 1726882359.83691: running the handler 11124 1726882359.83706: variable 'ansible_facts' from source: unknown 11124 1726882359.83726: _low_level_execute_command(): starting 11124 1726882359.83732: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882359.86416: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882359.86427: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882359.86439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882359.86490: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882359.86531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882359.86590: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882359.86602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882359.86621: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882359.86630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882359.86637: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882359.86645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882359.86656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882359.86670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882359.86677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882359.86685: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882359.86698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882359.86890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882359.86914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882359.86938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882359.87076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882359.88749: stdout chunk (state=3): >>>/root <<< 11124 1726882359.88877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882359.88956: stderr chunk 
(state=3): >>><<< 11124 1726882359.88959: stdout chunk (state=3): >>><<< 11124 1726882359.89082: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882359.89089: _low_level_execute_command(): starting 11124 1726882359.89092: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949 `" && echo ansible-tmp-1726882359.8898108-11159-921097534949="` echo /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949 `" ) && sleep 0' 11124 1726882359.91053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882359.91079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882359.91099: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11124 1726882359.91118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882359.91160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882359.91176: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882359.91193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882359.91215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882359.91227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882359.91239: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882359.91251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882359.91267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882359.91284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882359.91297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882359.91312: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882359.91327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882359.91404: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882359.91431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882359.91448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882359.91579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882359.93471: stdout chunk (state=3): 
>>>ansible-tmp-1726882359.8898108-11159-921097534949=/root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949 <<< 11124 1726882359.93775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882359.93779: stdout chunk (state=3): >>><<< 11124 1726882359.93781: stderr chunk (state=3): >>><<< 11124 1726882359.93784: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882359.8898108-11159-921097534949=/root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882359.93786: variable 'ansible_module_compression' from source: unknown 11124 1726882359.93788: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11124 1726882359.93790: ANSIBALLZ: Acquiring lock 11124 1726882359.93792: ANSIBALLZ: Lock acquired: 139628947188928 11124 1726882359.93794: ANSIBALLZ: Creating module 
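As an annotation on the trace above: the remote working directory is created by that `umask 77 && mkdir -p … && mkdir …` shell pipeline, and its name encodes a timestamp, a pid, and a random suffix. A minimal sketch of the naming pattern as observed in this log — the helper name and the width of the random suffix are assumptions, not Ansible's actual implementation:

```python
import os
import random
import time

def make_remote_tmp_name(now=None, pid=None):
    """Build a temp-dir name matching the pattern seen in the trace:
    ansible-tmp-<epoch>-<pid>-<random>. Hypothetical helper; the real
    random-suffix width may differ."""
    now = time.time() if now is None else now
    pid = os.getpid() if pid is None else pid
    return f"ansible-tmp-{now}-{pid}-{random.randrange(10**12)}"

# Reproducing the fixed parts of the name from the log entry above:
name = make_remote_tmp_name(now=1726882359.8898108, pid=11159)
```

The `umask 77` in the actual command ensures the directory is created mode 0700, readable only by the connecting user.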
11124 1726882360.76331: ANSIBALLZ: Writing module into payload 11124 1726882360.77437: ANSIBALLZ: Writing module 11124 1726882360.77811: ANSIBALLZ: Renaming module 11124 1726882360.77823: ANSIBALLZ: Done creating module 11124 1726882360.77866: variable 'ansible_facts' from source: unknown 11124 1726882360.77879: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882360.77893: _low_level_execute_command(): starting 11124 1726882360.77903: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11124 1726882360.79827: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882360.80584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882360.80587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882360.80897: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 11124 1726882360.80901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882360.80913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882360.81108: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882360.82744: stdout chunk (state=3): >>>PLATFORM <<< 11124 1726882360.82829: stdout chunk (state=3): >>>Linux <<< 11124 1726882360.82847: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 <<< 11124 1726882360.82862: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 11124 1726882360.82995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882360.83085: stderr chunk (state=3): >>><<< 11124 1726882360.83088: stdout chunk (state=3): >>><<< 11124 1726882360.83226: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882360.83237 [managed_node1]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 11124 1726882360.83241: _low_level_execute_command(): starting 11124 1726882360.83243: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 11124 1726882360.83600: Sending initial data 11124 1726882360.83603: Sent initial data (1181 bytes) 11124 1726882360.84711: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882360.84715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882360.84878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882360.84882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882360.84885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882360.84887: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882360.85156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882360.85184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
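The probe command in the trace above (`echo PLATFORM; uname; echo FOUND; command -v python3.12; …; echo ENDFOUND`) brackets its answers with sentinel lines so the controller can parse them out of arbitrary shell output. A minimal sketch of that parsing, under the assumption that this is roughly how the markers are consumed — the helper name is hypothetical:

```python
def parse_interpreter_probe(output: str):
    """Split the PLATFORM/FOUND/ENDFOUND probe output seen in the trace
    into (platform, candidate interpreter paths). Hypothetical helper."""
    lines = output.splitlines()
    platform = lines[lines.index("PLATFORM") + 1]
    found = lines[lines.index("FOUND") + 1 : lines.index("ENDFOUND")]
    return platform, found

# The stdout captured in the log: both `/usr/bin/python3` and `python3`
# resolve to the same path, hence the duplicate entry.
probe = (
    "PLATFORM\nLinux\nFOUND\n"
    "/usr/bin/python3.9\n/usr/bin/python3\n/usr/bin/python3\n"
    "ENDFOUND\n"
)
platform, interpreters = parse_interpreter_probe(probe)
```

This matches the log's subsequent `found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3']`, after which the first (most specific) hit is chosen.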
version 4 <<< 11124 1726882360.85320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882360.89057: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11124 1726882360.89419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882360.89569: stderr chunk (state=3): >>><<< 11124 1726882360.89573: stdout chunk (state=3): >>><<< 11124 1726882360.89576: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882360.89869: variable 'ansible_facts' from source: unknown 11124 1726882360.89872: variable 'ansible_facts' from source: unknown 11124 1726882360.89874: variable 'ansible_module_compression' from source: unknown 11124 1726882360.89876: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11124 1726882360.89879: variable 'ansible_facts' from source: unknown 11124 1726882360.89881: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/AnsiballZ_setup.py 11124 1726882360.90017: Sending initial data 11124 1726882360.90027: Sent initial data (151 bytes) 11124 1726882360.91406: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882360.91410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882360.91438: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882360.91442: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
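The JSON payload in the trace above (`platform_dist_result` plus the raw `osrelease_content` text) is how the discovery step reports the target OS back to the controller, which then keys platform behavior off fields like `ID` and `VERSION_ID`. A sketch of flattening that payload, assuming a simple key=value parse of the os-release text — the helper name is hypothetical and the payload below is a trimmed stand-in for the full one in the log:

```python
import json

def parse_osrelease(payload: str) -> dict:
    """Parse the discovery reply's JSON (as seen in the trace) and flatten
    its embedded os-release text into a dict. Hypothetical helper."""
    content = json.loads(payload)["osrelease_content"]
    info = {}
    for line in content.splitlines():
        key, sep, value = line.partition("=")
        if sep:
            info[key] = value.strip('"')
    return info

# Trimmed version of the payload captured in the log above:
payload = json.dumps({
    "platform_dist_result": [],
    "osrelease_content": 'NAME="CentOS Stream"\nID="centos"\n'
                         'VERSION_ID="9"\nPLATFORM_ID="platform:el9"\n',
})
info = parse_osrelease(payload)
```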
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882360.91452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882360.91527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882360.91559: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882360.91777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882360.93436: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882360.93530: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882360.93631: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpyo8zm4tg /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/AnsiballZ_setup.py <<< 11124 1726882360.93722: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882360.96897: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11124 1726882360.97204: stderr chunk (state=3): >>><<< 11124 1726882360.97208: stdout chunk (state=3): >>><<< 11124 1726882360.97210: done transferring module to remote 11124 1726882360.97212: _low_level_execute_command(): starting 11124 1726882360.97218: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/ /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/AnsiballZ_setup.py && sleep 0' 11124 1726882360.99781: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882360.99798: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882360.99933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882360.99955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.00022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882361.00038: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882361.00055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882361.00076: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882361.00089: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882361.00103: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882361.00114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882361.00126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.00151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.00165: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882361.00177: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882361.00190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882361.00392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882361.00410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882361.00424: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882361.00699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882361.02498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882361.02502: stdout chunk (state=3): >>><<< 11124 1726882361.02505: stderr chunk (state=3): >>><<< 11124 1726882361.02612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882361.02616: _low_level_execute_command(): starting 11124 1726882361.02619: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/AnsiballZ_setup.py && sleep 0' 11124 1726882361.05522: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.05527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.05559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882361.05563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.05573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882361.05627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882361.06026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882361.06029: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882361.06130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
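The command above runs `AnsiballZ_setup.py` with `PYTHONVERBOSE=1`, which is why everything that follows in stdout is an interpreter import trace (`import _frozen_importlib # frozen`, `# code object from …`, and so on). `PYTHONVERBOSE=1` is equivalent to passing `-v` to CPython; a minimal local reproduction of the same kind of trace:

```python
import subprocess
import sys

# CPython's -v flag (equivalent to PYTHONVERBOSE=1) prints a line to
# stderr for every module imported at startup -- the same style of
# "import ... #" trace that fills the following log lines.
proc = subprocess.run(
    [sys.executable, "-v", "-c", "pass"],
    capture_output=True,
    text=True,
)
trace = proc.stderr
```

In the log this verbosity is useful for debugging module-startup failures on the managed node, since every `.pyc` cache hit and `.so` extension load is recorded before the module's own code runs.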
<<< 11124 1726882361.08190: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11124 1726882361.08210: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11124 1726882361.08306: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 11124 1726882361.08334: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11124 1726882361.08381: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 11124 1726882361.08387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.08440: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 11124 1726882361.08444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 11124 1726882361.08446: stdout chunk (state=3): >>>import '_codecs' # <<< 11124 1726882361.08476: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e16d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 11124 1726882361.08506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e16d8b20> <<< 11124 1726882361.08533: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 11124 1726882361.08598: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e16d8ac0> import '_signal' # <<< 11124 1726882361.08602: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 11124 1726882361.08608: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d490> <<< 11124 1726882361.08645: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 11124 1726882361.08650: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 11124 1726882361.08653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 11124 1726882361.08694: stdout chunk (state=3): >>>import '_abc' # <<< 11124 1726882361.08697: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d940> <<< 11124 1726882361.08703: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d670> <<< 11124 1726882361.08737: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 11124 1726882361.08806: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches 
/usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 11124 1726882361.08873: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1634190> <<< 11124 1726882361.08875: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 11124 1726882361.08877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 11124 1726882361.08970: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1634220> <<< 11124 1726882361.08973: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 11124 1726882361.09014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 11124 1726882361.09017: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1657850> <<< 11124 1726882361.09021: stdout chunk (state=3): >>>import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1634940> <<< 11124 1726882361.09033: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1695880> <<< 11124 1726882361.09077: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 11124 1726882361.09079: stdout chunk (state=3): >>>import '_sitebuiltins' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36e162dd90> <<< 11124 1726882361.09141: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # <<< 11124 1726882361.09144: stdout chunk (state=3): >>>import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1657d90> <<< 11124 1726882361.09242: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11124 1726882361.09559: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 11124 1726882361.09564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 11124 1726882361.09598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 11124 1726882361.09601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 11124 1726882361.09618: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 11124 1726882361.09645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 11124 1726882361.09651: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 11124 1726882361.09665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d4eb0> <<< 11124 
1726882361.09718: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d6f40> <<< 11124 1726882361.09735: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 11124 1726882361.09752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 11124 1726882361.09771: stdout chunk (state=3): >>>import '_sre' # <<< 11124 1726882361.09790: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 11124 1726882361.09812: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 11124 1726882361.09832: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 11124 1726882361.09846: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13cc610> <<< 11124 1726882361.09877: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d2640> <<< 11124 1726882361.09889: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d4370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 11124 1726882361.09960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 11124 1726882361.09976: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 11124 1726882361.10010: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.10038: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 11124 1726882361.10057: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.10076: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e12b9dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b98b0> <<< 11124 1726882361.10096: stdout chunk (state=3): >>>import 'itertools' # <<< 11124 1726882361.10120: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b9eb0> <<< 11124 1726882361.10135: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 11124 1726882361.10155: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 11124 1726882361.10179: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b9f70> <<< 11124 1726882361.10201: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 11124 1726882361.10213: stdout chunk (state=3): >>>import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b9e80> import '_collections' # <<< 11124 1726882361.10267: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13aed30> <<< 11124 1726882361.10284: stdout chunk (state=3): >>>import '_functools' # <<< 11124 1726882361.10300: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13a7610> <<< 11124 1726882361.10351: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 11124 1726882361.10374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13dae20> <<< 11124 1726882361.10394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 11124 1726882361.10407: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e12cbc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13ae250> <<< 11124 1726882361.10457: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f36e13bb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13e09d0> <<< 11124 1726882361.10574: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cbfa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cbd90> <<< 11124 1726882361.10607: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cbd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 11124 1726882361.10621: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 11124 1726882361.10637: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 11124 1726882361.10670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 11124 1726882361.10712: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 11124 1726882361.10744: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e129e370> <<< 11124 1726882361.10777: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 11124 1726882361.10809: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e129e460> <<< 11124 1726882361.10927: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12d3fa0> <<< 11124 1726882361.10973: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cda30> <<< 11124 1726882361.10980: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cd490> <<< 11124 1726882361.11006: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 11124 1726882361.11009: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 11124 1726882361.11052: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 11124 1726882361.11060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 11124 1726882361.11091: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from 
'/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 11124 1726882361.11104: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11d21c0> <<< 11124 1726882361.11119: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1289c70> <<< 11124 1726882361.11173: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cdeb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13e0040> <<< 11124 1726882361.11198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 11124 1726882361.11219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 11124 1726882361.11261: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 11124 1726882361.11265: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11e4af0> <<< 11124 1726882361.11272: stdout chunk (state=3): >>>import 'errno' # <<< 11124 1726882361.11295: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11e4e20> <<< 11124 1726882361.11355: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 11124 1726882361.11361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 11124 1726882361.11369: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 11124 1726882361.11372: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 11124 1726882361.11374: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11f6730> <<< 11124 1726882361.11389: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 11124 1726882361.11406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 11124 1726882361.11436: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11f6c70> <<< 11124 1726882361.11490: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.11497: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e118e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11e4f10> <<< 11124 1726882361.11512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 11124 1726882361.11566: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e119f280> <<< 11124 1726882361.11570: stdout chunk (state=3): 
>>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11f65b0> import 'pwd' # <<< 11124 1726882361.11603: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e119f340> <<< 11124 1726882361.11642: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cb9d0> <<< 11124 1726882361.11675: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 11124 1726882361.11678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 11124 1726882361.11697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 11124 1726882361.11719: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 11124 1726882361.11735: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11ba6a0> <<< 11124 1726882361.11759: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 11124 1726882361.11783: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.11795: stdout chunk (state=3): >>># 
extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11ba970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11ba760> <<< 11124 1726882361.11823: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11ba850> <<< 11124 1726882361.11852: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 11124 1726882361.12050: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11baca0> <<< 11124 1726882361.12101: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11c71f0> <<< 11124 1726882361.12104: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11ba8e0> <<< 11124 1726882361.12110: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11aea30> <<< 11124 1726882361.12130: stdout 
chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cb5b0> <<< 11124 1726882361.12155: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 11124 1726882361.12210: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 11124 1726882361.12239: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11baa90> <<< 11124 1726882361.12386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 11124 1726882361.12398: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36e10e4670> <<< 11124 1726882361.12662: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11124 1726882361.12755: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.12821: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py <<< 11124 1726882361.12824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.12827: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.12829: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 11124 1726882361.12841: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.14043: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.14962: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches 
/usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f787f0> <<< 11124 1726882361.14988: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.15019: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 11124 1726882361.15033: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 11124 1726882361.15061: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.15066: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1009760> <<< 11124 1726882361.15092: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009640> <<< 11124 1726882361.15134: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009370> <<< 11124 1726882361.15154: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 11124 1726882361.15194: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009490> <<< 11124 1726882361.15197: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009190> import 'atexit' # <<< 11124 1726882361.15243: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1009400> <<< 11124 1726882361.15246: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 11124 1726882361.15272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 11124 1726882361.15313: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e10097c0> <<< 11124 1726882361.15340: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 11124 1726882361.15350: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 11124 1726882361.15377: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 11124 1726882361.15385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 11124 1726882361.15402: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 11124 1726882361.15485: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fe27c0> <<< 11124 
1726882361.15520: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0fe2b50> <<< 11124 1726882361.15578: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.15581: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0fe29a0> <<< 11124 1726882361.15584: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 11124 1726882361.15586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 11124 1726882361.15617: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e09c74f0> <<< 11124 1726882361.15629: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1002d30> <<< 11124 1726882361.15794: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009520> <<< 11124 1726882361.15827: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 11124 1726882361.15866: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1002190> <<< 11124 1726882361.15869: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 11124 1726882361.15872: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 11124 1726882361.15897: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 11124 1726882361.15921: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 11124 1726882361.15960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 11124 1726882361.15965: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1033a90> <<< 11124 1726882361.16045: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fd6190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fd6790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e09ccd00> <<< 11124 1726882361.16070: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0fd66a0> <<< 11124 1726882361.16098: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1057d30> <<< 11124 1726882361.16128: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 11124 1726882361.16135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 11124 1726882361.16159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 11124 1726882361.16186: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 11124 1726882361.16263: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f599a0> <<< 11124 1726882361.16266: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1062e50> <<< 11124 1726882361.16292: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 11124 1726882361.16345: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f690d0> <<< 11124 1726882361.16352: stdout chunk (state=3): >>>import 
'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1062e20> <<< 11124 1726882361.16370: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 11124 1726882361.16405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.16436: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 11124 1726882361.16439: stdout chunk (state=3): >>>import '_string' # <<< 11124 1726882361.16494: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1069220> <<< 11124 1726882361.16624: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f69100> <<< 11124 1726882361.16719: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.16728: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e102db80> <<< 11124 1726882361.16753: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1062ac0> <<< 11124 1726882361.16791: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.16794: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1062d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e10e4820> <<< 11124 1726882361.16828: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 11124 1726882361.16844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 11124 1726882361.16866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 11124 1726882361.16913: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f650d0> <<< 11124 1726882361.17089: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f5b370> <<< 11124 1726882361.17138: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f65d00> <<< 11124 1726882361.17144: stdout chunk (state=3): >>># 
extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f656a0> <<< 11124 1726882361.17156: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f66130> <<< 11124 1726882361.17159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.17173: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 11124 1726882361.17249: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.17337: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.17341: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 11124 1726882361.17376: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11124 1726882361.17380: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 11124 1726882361.17382: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.17476: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.17572: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.18010: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11124 1726882361.18469: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py <<< 11124 1726882361.18483: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 11124 1726882361.18517: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 11124 1726882361.18521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.18565: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f768b0> <<< 11124 1726882361.18645: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py <<< 11124 1726882361.18651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fa3910> <<< 11124 1726882361.18666: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05c96a0> <<< 11124 1726882361.18695: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 11124 1726882361.18719: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.18726: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.18739: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 11124 1726882361.18875: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.18995: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 11124 1726882361.19024: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fe07f0> <<< 11124 1726882361.19028: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.19437: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.19783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.19835: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.19907: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py <<< 11124 1726882361.19913: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.19940: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.19978: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 11124 1726882361.19981: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20035: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20127: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 11124 1726882361.20134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20138: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 11124 1726882361.20159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20180: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20217: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 11124 1726882361.20220: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20408: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20601: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 11124 1726882361.20628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 11124 1726882361.20631: stdout chunk (state=3): >>>import '_ast' # <<< 11124 1726882361.20705: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05ced90> # zipimport: zlib available <<< 11124 
1726882361.20768: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20851: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py <<< 11124 1726882361.20854: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 11124 1726882361.20861: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 11124 1726882361.20875: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20909: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20950: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 11124 1726882361.20954: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.20982: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21026: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21122: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21186: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 11124 
1726882361.21200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.21276: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f940a0> <<< 11124 1726882361.21365: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0597070> <<< 11124 1726882361.21397: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py <<< 11124 1726882361.21400: stdout chunk (state=3): >>>import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 11124 1726882361.21456: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21507: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21529: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21583: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 11124 1726882361.21588: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 11124 1726882361.21601: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 11124 1726882361.21643: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 11124 1726882361.21646: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 11124 1726882361.21677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 11124 1726882361.21753: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f9d160> <<< 11124 1726882361.21796: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f9acd0> <<< 11124 1726882361.21853: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05cebb0> <<< 11124 1726882361.21859: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 11124 1726882361.21880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.21907: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py <<< 11124 1726882361.21910: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 11124 1726882361.21983: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 11124 1726882361.22006: stdout chunk (state=3): >>># zipimport: zlib 
available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 11124 1726882361.22009: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22069: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22126: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22145: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22195: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22229: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22301: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 11124 1726882361.22306: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22367: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22435: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22462: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22479: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 11124 1726882361.22628: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22769: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22805: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.22878: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 11124 1726882361.22884: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.22890: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py <<< 11124 1726882361.22893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 11124 1726882361.22912: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 11124 1726882361.22960: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0349a60> <<< 11124 1726882361.22967: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 11124 1726882361.22970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 11124 1726882361.22984: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 11124 1726882361.23004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 11124 1726882361.23040: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 11124 1726882361.23043: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 11124 1726882361.23058: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05aa6d0> <<< 11124 1726882361.23090: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.23093: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e05aaaf0> <<< 11124 1726882361.23157: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e058f250> <<< 11124 1726882361.23195: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e058fa30> <<< 11124 1726882361.23198: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05de460> <<< 11124 1726882361.23201: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05de910> <<< 11124 1726882361.23229: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 11124 1726882361.23232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 11124 1726882361.23269: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py <<< 11124 1726882361.23272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 11124 1726882361.23299: stdout chunk (state=3): >>># extension module '_queue' loaded from 
'/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e05dbd00> <<< 11124 1726882361.23302: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05dbd60> <<< 11124 1726882361.23326: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 11124 1726882361.23331: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 11124 1726882361.23357: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05db250> <<< 11124 1726882361.23385: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 11124 1726882361.23388: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 11124 1726882361.23418: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e03b1f70> <<< 11124 1726882361.23452: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05f34c0> <<< 11124 1726882361.23486: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05de310> 
import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 11124 1726882361.23520: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11124 1726882361.23524: stdout chunk (state=3): >>>import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 11124 1726882361.23526: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23584: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23641: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 11124 1726882361.23676: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23718: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 11124 1726882361.23739: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 11124 1726882361.23758: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23778: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11124 1726882361.23808: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 11124 1726882361.23823: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23860: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23903: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py <<< 11124 1726882361.23914: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23939: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.23982: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available <<< 11124 1726882361.24041: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.24084: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.24135: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.24191: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py <<< 11124 1726882361.24194: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 11124 1726882361.24581: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11124 1726882361.24942: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 11124 1726882361.24945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.24982: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25035: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25060: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25099: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 11124 1726882361.25102: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25122: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25156: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 11124 1726882361.25159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25208: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25270: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 11124 1726882361.25273: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 
1726882361.25284: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25317: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 11124 1726882361.25320: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25349: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25380: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 11124 1726882361.25383: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25443: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25520: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 11124 1726882361.25537: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e02bfca0> <<< 11124 1726882361.25586: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 11124 1726882361.25589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 11124 1726882361.25743: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e02bffd0> <<< 11124 1726882361.25746: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: 
zlib available <<< 11124 1726882361.25799: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25865: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 11124 1726882361.25869: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.25931: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26012: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 11124 1726882361.26073: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26150: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 11124 1726882361.26154: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26182: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26223: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 11124 1726882361.26238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 11124 1726882361.26388: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e02bd370> <<< 11124 1726882361.26623: stdout chunk 
(state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e030cbb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 11124 1726882361.26626: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26671: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26719: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available <<< 11124 1726882361.26801: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26868: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.26961: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27097: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py <<< 11124 1726882361.27105: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available <<< 11124 1726882361.27134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27173: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 11124 1726882361.27177: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 
1726882361.27203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27253: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 11124 1726882361.27306: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0243160> <<< 11124 1726882361.27310: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e02432b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 11124 1726882361.27339: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 11124 1726882361.27342: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27381: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27419: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 11124 1726882361.27424: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27566: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27694: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 11124 1726882361.27698: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27770: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27847: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27914: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 11124 1726882361.27935: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.27997: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28021: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28134: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28262: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 11124 1726882361.28265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28366: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28473: stdout chunk (state=3): 
>>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 11124 1726882361.28505: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28523: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.28957: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.29361: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py <<< 11124 1726882361.29367: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 11124 1726882361.29682: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available <<< 11124 1726882361.29707: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 11124 1726882361.29845: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30003: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 11124 1726882361.30006: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30008: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 11124 1726882361.30010: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30037: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30088: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 11124 1726882361.30091: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30173: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30254: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30426: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30592: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 11124 1726882361.30612: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30627: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30675: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 11124 1726882361.30695: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30720: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 11124 1726882361.30724: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30786: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30845: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 11124 1726882361.30879: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30901: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py <<< 11124 1726882361.30904: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.30951: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31015: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 11124 1726882361.31060: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31115: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 11124 1726882361.31118: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31329: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31546: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available <<< 11124 1726882361.31597: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31668: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 11124 1726882361.31672: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31684: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31716: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 11124 1726882361.31756: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31790: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 11124 1726882361.31796: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31811: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31851: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 11124 1726882361.31853: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.31924: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32024: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 11124 1726882361.32027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32029: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32031: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 11124 1726882361.32033: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32067: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32116: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 11124 1726882361.32119: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32133: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32158: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32194: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32237: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32291: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32369: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 11124 1726882361.32373: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 11124 1726882361.32418: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32471: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py <<< 11124 1726882361.32474: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32625: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32790: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 11124 1726882361.32793: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32831: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32881: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 11124 1726882361.32888: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32917: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.32963: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available <<< 11124 1726882361.33035: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.33110: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py <<< 11124 1726882361.33113: stdout chunk (state=3): >>>import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 11124 1726882361.33189: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.33275: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py <<< 11124 1726882361.33278: stdout chunk (state=3): >>>import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 11124 1726882361.33339: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882361.33525: stdout chunk (state=3): >>>import 'gc' # <<< 11124 1726882361.34378: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 11124 1726882361.34406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 11124 1726882361.34409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 11124 1726882361.34457: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882361.34461: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0294d90> <<< 11124 1726882361.34464: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0294c10> <<< 11124 1726882361.34521: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e00ad850> <<< 11124 1726882361.37738: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 11124 1726882361.37766: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0294a30> <<< 11124 1726882361.37769: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 11124 1726882361.37801: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 11124 1726882361.37812: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0243eb0> <<< 11124 1726882361.37883: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882361.37930: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py <<< 11124 1726882361.37934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e00a6310> <<< 11124 1726882361.37936: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e023f700> <<< 11124 1726882361.38208: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 11124 1726882361.38229: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11124 1726882361.62201: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective<<< 11124 1726882361.62207: stdout chunk (state=3): >>>_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "41", "epoch": "1726882361", "epoch_int": "1726882361", "date": "2024-09-20", "time": "21:32:41", "iso8601_micro": "2024-09-21T01:32:41.360625Z", "iso8601": "2024-09-21T01:32:41Z", "iso8601_basic": "20240920T213241360625", "iso8601_basic_short": "20240920T213241", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.54, 
"5m": 0.31, "15m": 0.15}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2820, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 712, "free": 2820}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "<<< 11124 1726882361.62271: stdout chunk (state=3): >>>sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 519, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264240586752, "block_size": 4096, "block_total": 65519355, "block_available": 64511862, "block_used": 1007493, "inode_total": 131071472, "inode_available": 130998723, "inode_used": 72749, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", 
"network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", 
"tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11124 1726882361.62785: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks <<< 11124 1726882361.62792: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs <<< 11124 1726882361.62801: stdout chunk (state=3): >>># cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # 
cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale <<< 11124 1726882361.62809: stdout chunk (state=3): >>># cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types <<< 11124 1726882361.62845: stdout chunk (state=3): >>># cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing 
threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math <<< 11124 1726882361.62854: stdout chunk (state=3): >>># cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib <<< 11124 1726882361.62904: stdout chunk (state=3): >>># cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid <<< 11124 1726882361.62909: stdout chunk (state=3): >>># cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal 
# cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text <<< 11124 1726882361.62916: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections <<< 11124 1726882361.62939: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # 
cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle <<< 11124 1726882361.62989: stdout chunk (state=3): >>># cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing 
ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser <<< 11124 1726882361.63039: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing 
ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.<<< 11124 1726882361.63044: stdout chunk (state=3): >>>system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy 
ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy 
ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11124 1726882361.63303: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11124 1726882361.63337: stdout chunk (state=3): >>># destroy importlib.util <<< 11124 1726882361.63340: stdout chunk (state=3): >>># destroy importlib.abc # destroy importlib.machinery <<< 11124 1726882361.63382: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 11124 1726882361.63388: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 11124 1726882361.63409: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 11124 1726882361.63432: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 11124 1726882361.63491: stdout chunk (state=3): >>># destroy selinux <<< 11124 1726882361.63497: stdout chunk (state=3): >>># destroy distro # destroy logging # destroy argparse <<< 11124 1726882361.63537: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 11124 1726882361.63541: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 11124 1726882361.63561: stdout chunk (state=3): >>># destroy queue # destroy 
multiprocessing.reduction <<< 11124 1726882361.63624: stdout chunk (state=3): >>># destroy shlex <<< 11124 1726882361.63632: stdout chunk (state=3): >>># destroy datetime <<< 11124 1726882361.63635: stdout chunk (state=3): >>># destroy base64 <<< 11124 1726882361.63638: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 11124 1726882361.63674: stdout chunk (state=3): >>># destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection <<< 11124 1726882361.63677: stdout chunk (state=3): >>># destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 11124 1726882361.63858: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios <<< 11124 1726882361.63865: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux <<< 11124 1726882361.63872: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 11124 1726882361.63876: stdout chunk (state=3): >>># destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform 
<<< 11124 1726882361.63880: stdout chunk (state=3): >>># destroy subprocess <<< 11124 1726882361.63886: stdout chunk (state=3): >>># cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 11124 1726882361.63890: stdout chunk (state=3): >>># cleanup[3] wiping shutil <<< 11124 1726882361.63894: stdout chunk (state=3): >>># destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 11124 1726882361.63897: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11124 1726882361.63903: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile <<< 11124 1726882361.63905: stdout chunk (state=3): >>># destroy copyreg # cleanup[3] wiping functools <<< 11124 1726882361.63937: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] 
wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp <<< 11124 1726882361.63942: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11124 1726882361.63979: stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle <<< 11124 1726882361.63990: stdout chunk (state=3): >>># destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 11124 1726882361.64146: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 11124 1726882361.64182: stdout chunk (state=3): >>># destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath <<< 11124 1726882361.64211: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select <<< 11124 1726882361.64225: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # 
destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 11124 1726882361.64278: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 11124 1726882361.64561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882361.64640: stderr chunk (state=3): >>><<< 11124 1726882361.64646: stdout chunk (state=3): >>><<< 11124 1726882361.64904: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e16d8dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d3a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e16d8b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f36e16d8ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1634190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1634220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1657850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1634940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1695880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e162dd90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1657d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e167d970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d4eb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d6f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13cc610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d2640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13d4370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e12b9dc0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b98b0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b9eb0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b9f70> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12b9e80> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13aed30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13a7610> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36e13bb670> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13dae20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e12cbc70> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13ae250> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e13bb280> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13e09d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cbfa0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cbd90> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cbd00> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e129e370> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e129e460> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12d3fa0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cda30> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cd490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11d21c0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1289c70> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cdeb0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e13e0040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11e4af0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11e4e20> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11f6730> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11f6c70> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e118e3a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11e4f10> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e119f280> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11f65b0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e119f340> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cb9d0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11ba6a0> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11ba970> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11ba760> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11ba850> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11baca0> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e11c71f0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11ba8e0> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11aea30> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e12cb5b0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e11baa90> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f36e10e4670> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f787f0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from 
'/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1009760> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009640> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009370> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009490> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009190> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1009400> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e10097c0> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fe27c0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0fe2b50> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0fe29a0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e09c74f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1002d30> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1009520> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1002190> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches 
/usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1033a90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fd6190> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fd6790> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e09ccd00> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0fd66a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1057d30> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f599a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1062e50> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f690d0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1062e20> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e1069220> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f69100> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e102db80> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1062ac0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e1062d00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e10e4820> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f650d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f5b370> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f65d00> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f656a0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f66130> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import 
'_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f768b0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fa3910> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05c96a0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0fe07f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05ced90> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0f940a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0597070> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f9d160> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0f9acd0> import 'distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f36e05cebb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0349a60> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05aa6d0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e05aaaf0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e058f250> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e058fa30> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05de460> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05de910> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e05dbd00> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05dbd60> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05db250> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e03b1f70> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f36e05f34c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e05de310> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e02bfca0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e02bffd0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e02bd370> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e030cbb0> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0243160> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e02432b0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_ztoyc9ai/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f36e0294d90> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0294c10> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e00ad850> # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0294a30> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e0243eb0> # 
/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e00a6310> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f36e023f700> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": 
"x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "41", "epoch": "1726882361", "epoch_int": "1726882361", "date": "2024-09-20", "time": "21:32:41", "iso8601_micro": "2024-09-21T01:32:41.360625Z", "iso8601": "2024-09-21T01:32:41Z", "iso8601_basic": "20240920T213241360625", "iso8601_basic_short": "20240920T213241", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_loadavg": {"1m": 0.54, "5m": 0.31, "15m": 0.15}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2820, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 712, "free": 2820}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 519, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 
268367278080, "size_available": 264240586752, "block_size": 4096, "block_total": 65519355, "block_available": 64511862, "block_used": 1007493, "inode_total": 131071472, "inode_available": 130998723, "inode_used": 72749, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", 
"tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], 
"ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} 
# clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # 
cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal 
# cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] 
removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # 
cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # 
destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # 
destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy 
json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # 
cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib 
# destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
[WARNING]: Module invocation had junk after the JSON data:
removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # 
cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing 
ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy 
ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy 
ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # 
cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy 
posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks
[WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
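The interpreter-discovery warning above can be silenced by pinning the interpreter explicitly instead of relying on discovery. A minimal sketch, using the host names and interpreter path that appear in this log (the variable placement is illustrative; the actual inventory at /tmp/network-91m/inventory.yml may be structured differently):

```yaml
# Hypothetical inventory fragment: pinning ansible_python_interpreter
# ensures a future Python installation on the managed hosts cannot
# change which interpreter modules run under.
all:
  hosts:
    managed_node1:
      ansible_python_interpreter: /usr/bin/python3.9
    managed_node2:
      ansible_python_interpreter: /usr/bin/python3.9
```

The same effect can be had globally via `interpreter_python` in ansible.cfg, at the cost of applying one path to every host.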
11124 1726882361.66592: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882361.66595: _low_level_execute_command(): starting 11124 1726882361.66598: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882359.8898108-11159-921097534949/ > /dev/null 2>&1 && sleep 0' 11124 1726882361.68605: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.68609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.68650: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882361.68655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.68659: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.68876: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882361.68933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882361.69199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882361.69206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882361.69308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882361.71171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882361.71234: stderr chunk (state=3): >>><<< 11124 1726882361.71237: stdout chunk (state=3): >>><<< 11124 1726882361.71571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882361.71574: handler run complete 11124 1726882361.71577: variable 'ansible_facts' from source: unknown 11124 1726882361.71579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.73885: variable 'ansible_facts' from source: unknown 11124 1726882361.74150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.74626: attempt loop complete, returning result 11124 1726882361.74637: _execute() done 11124 1726882361.74720: dumping result to json 11124 1726882361.74754: done dumping result, returning 11124 1726882361.74780: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-8362-0f62-0000000000cd] 11124 1726882361.74831: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000cd ok: [managed_node1] 11124 1726882361.75936: no more pending results, returning what we have 11124 1726882361.75939: results queue empty 11124 1726882361.75940: checking for any_errors_fatal 11124 1726882361.75942: done checking for any_errors_fatal 11124 1726882361.75942: checking for max_fail_percentage 11124 1726882361.75944: done checking for max_fail_percentage 11124 1726882361.75945: checking to see if all hosts have failed and the running result is not ok 11124 1726882361.75946: done checking to see if all hosts have failed 11124 1726882361.75947: getting the remaining hosts for this loop 11124 1726882361.75948: done getting the remaining hosts for this loop 11124 1726882361.75953: getting the next task for host managed_node1 11124 1726882361.75960: done getting next task for host managed_node1 11124 1726882361.75962: ^ task is: TASK: meta (flush_handlers) 11124 1726882361.75969: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882361.75974: getting variables 11124 1726882361.75976: in VariableManager get_vars() 11124 1726882361.76002: Calling all_inventory to load vars for managed_node1 11124 1726882361.76005: Calling groups_inventory to load vars for managed_node1 11124 1726882361.76009: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882361.76021: Calling all_plugins_play to load vars for managed_node1 11124 1726882361.76023: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882361.76026: Calling groups_plugins_play to load vars for managed_node1 11124 1726882361.76210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.76681: done with get_vars() 11124 1726882361.76692: done getting variables 11124 1726882361.76989: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000cd 11124 1726882361.76992: WORKER PROCESS EXITING 11124 1726882361.77035: in VariableManager get_vars() 11124 1726882361.77044: Calling all_inventory to load vars for managed_node1 11124 1726882361.77046: Calling groups_inventory to load vars for managed_node1 11124 1726882361.77048: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882361.77053: Calling all_plugins_play to load vars for managed_node1 11124 1726882361.77269: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882361.77279: Calling groups_plugins_play to load vars for managed_node1 11124 1726882361.77801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.78440: done with get_vars() 11124 1726882361.78455: done queuing things up, now waiting for results queue to drain 11124 1726882361.78458: results 
queue empty 11124 1726882361.78458: checking for any_errors_fatal 11124 1726882361.78461: done checking for any_errors_fatal 11124 1726882361.78461: checking for max_fail_percentage 11124 1726882361.78751: done checking for max_fail_percentage 11124 1726882361.78752: checking to see if all hosts have failed and the running result is not ok 11124 1726882361.78753: done checking to see if all hosts have failed 11124 1726882361.78754: getting the remaining hosts for this loop 11124 1726882361.78755: done getting the remaining hosts for this loop 11124 1726882361.78758: getting the next task for host managed_node1 11124 1726882361.78763: done getting next task for host managed_node1 11124 1726882361.78768: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11124 1726882361.78770: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882361.78772: getting variables 11124 1726882361.78773: in VariableManager get_vars() 11124 1726882361.78783: Calling all_inventory to load vars for managed_node1 11124 1726882361.78785: Calling groups_inventory to load vars for managed_node1 11124 1726882361.78787: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882361.78906: Calling all_plugins_play to load vars for managed_node1 11124 1726882361.78909: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882361.78912: Calling groups_plugins_play to load vars for managed_node1 11124 1726882361.79266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.79887: done with get_vars() 11124 1726882361.79896: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Friday 20 September 2024 21:32:41 -0400 (0:00:02.004) 0:00:02.042 ****** 11124 1726882361.79978: entering _queue_task() for managed_node1/include_tasks 11124 1726882361.79980: Creating lock for include_tasks 11124 1726882361.80780: worker is 1 (out of 1 available) 11124 1726882361.80791: exiting _queue_task() for managed_node1/include_tasks 11124 1726882361.80800: done queuing things up, now waiting for results queue to drain 11124 1726882361.80802: waiting for pending results... 
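The task being queued here (task path tests_bond_deprecated_nm.yml:11, name "Include the task 'el_repo_setup.yml'") is an ordinary dynamic include; the lock creation for include_tasks in the log confirms the action type. A hedged sketch of what such a playbook entry typically looks like (the task name and target file come from the log; any extra keywords in the real test playbook are unknown):

```yaml
# Illustrative only: reconstructed from the task name and included
# file path shown in the debug output, not from the playbook source.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```

Because include_tasks is dynamic, the log later shows the include being processed at run time: "we have included files to process", block generation, and tag filtering before the new tasks are appended to the host's task list.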
11124 1726882361.81656: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 11124 1726882361.81869: in run() - task 0e448fcc-3ce9-8362-0f62-000000000006 11124 1726882361.81886: variable 'ansible_search_path' from source: unknown 11124 1726882361.81927: calling self._execute() 11124 1726882361.82115: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882361.82128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882361.82166: variable 'omit' from source: magic vars 11124 1726882361.82258: _execute() done 11124 1726882361.82378: dumping result to json 11124 1726882361.82385: done dumping result, returning 11124 1726882361.82395: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0e448fcc-3ce9-8362-0f62-000000000006] 11124 1726882361.82405: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000006 11124 1726882361.82626: no more pending results, returning what we have 11124 1726882361.82632: in VariableManager get_vars() 11124 1726882361.82669: Calling all_inventory to load vars for managed_node1 11124 1726882361.82673: Calling groups_inventory to load vars for managed_node1 11124 1726882361.82677: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882361.82692: Calling all_plugins_play to load vars for managed_node1 11124 1726882361.82695: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882361.82698: Calling groups_plugins_play to load vars for managed_node1 11124 1726882361.82911: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.83132: done with get_vars() 11124 1726882361.83139: variable 'ansible_search_path' from source: unknown 11124 1726882361.83155: we have included files to process 11124 1726882361.83156: generating all_blocks data 11124 1726882361.83158: done generating all_blocks data 11124 
1726882361.83159: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11124 1726882361.83160: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11124 1726882361.83162: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11124 1726882361.83893: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000006 11124 1726882361.83896: WORKER PROCESS EXITING 11124 1726882361.84758: in VariableManager get_vars() 11124 1726882361.84775: done with get_vars() 11124 1726882361.84788: done processing included file 11124 1726882361.84790: iterating over new_blocks loaded from include file 11124 1726882361.84792: in VariableManager get_vars() 11124 1726882361.84802: done with get_vars() 11124 1726882361.84803: filtering new block on tags 11124 1726882361.84931: done filtering new block on tags 11124 1726882361.84935: in VariableManager get_vars() 11124 1726882361.84946: done with get_vars() 11124 1726882361.84947: filtering new block on tags 11124 1726882361.84965: done filtering new block on tags 11124 1726882361.84968: in VariableManager get_vars() 11124 1726882361.84978: done with get_vars() 11124 1726882361.84980: filtering new block on tags 11124 1726882361.84994: done filtering new block on tags 11124 1726882361.84996: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 11124 1726882361.85021: extending task lists for all hosts with included blocks 11124 1726882361.85188: done extending task lists 11124 1726882361.85189: done processing included files 11124 1726882361.85190: results queue empty 11124 1726882361.85191: checking for any_errors_fatal 11124 1726882361.85192: done checking for any_errors_fatal 
11124 1726882361.85193: checking for max_fail_percentage 11124 1726882361.85194: done checking for max_fail_percentage 11124 1726882361.85195: checking to see if all hosts have failed and the running result is not ok 11124 1726882361.85195: done checking to see if all hosts have failed 11124 1726882361.85196: getting the remaining hosts for this loop 11124 1726882361.85197: done getting the remaining hosts for this loop 11124 1726882361.85200: getting the next task for host managed_node1 11124 1726882361.85203: done getting next task for host managed_node1 11124 1726882361.85205: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11124 1726882361.85208: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882361.85210: getting variables 11124 1726882361.85211: in VariableManager get_vars() 11124 1726882361.85218: Calling all_inventory to load vars for managed_node1 11124 1726882361.85221: Calling groups_inventory to load vars for managed_node1 11124 1726882361.85223: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882361.85228: Calling all_plugins_play to load vars for managed_node1 11124 1726882361.85230: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882361.85233: Calling groups_plugins_play to load vars for managed_node1 11124 1726882361.85587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882361.86004: done with get_vars() 11124 1726882361.86013: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 21:32:41 -0400 (0:00:00.060) 0:00:02.103 ****** 11124 1726882361.86076: entering _queue_task() for managed_node1/setup 11124 1726882361.86646: worker is 1 (out of 1 available) 11124 1726882361.86775: exiting _queue_task() for managed_node1/setup 11124 1726882361.86789: done queuing things up, now waiting for results queue to drain 11124 1726882361.86791: waiting for pending results... 
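The setup task queued here is gated by the conditional logged further down, `not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts`: facts are gathered only when at least one required fact is not already cached. A hedged Python equivalent of that subset check (variable names mirror the log; this is a sketch, not the role's implementation):

```python
# Sketch of the "gather only if facts are missing" conditional.
# Jinja2's `intersect` filter yields the elements common to both lists;
# the task runs when that intersection does not equal the required list,
# i.e. when some required fact has not been gathered yet.

def needs_gathering(ansible_facts: dict, required_facts: list) -> bool:
    # intersect(ansible_facts.keys(), required_facts), preserving order
    gathered = [f for f in required_facts if f in ansible_facts]
    return gathered != required_facts   # True -> run the setup module

# No cached facts yet, as at this point in the log: the task runs.
print(needs_gathering({}, ["distribution", "os_family"]))  # True
```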
11124 1726882361.87509: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 11124 1726882361.87738: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000de 11124 1726882361.87875: variable 'ansible_search_path' from source: unknown 11124 1726882361.87884: variable 'ansible_search_path' from source: unknown 11124 1726882361.87926: calling self._execute() 11124 1726882361.88034: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882361.88191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882361.88206: variable 'omit' from source: magic vars 11124 1726882361.89232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882361.93937: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882361.94021: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882361.94187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882361.94226: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882361.94391: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882361.94589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882361.94624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882361.94653: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882361.94803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882361.94827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882361.95128: variable 'ansible_facts' from source: unknown 11124 1726882361.95206: variable 'network_test_required_facts' from source: task vars 11124 1726882361.95382: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11124 1726882361.95394: variable 'omit' from source: magic vars 11124 1726882361.95434: variable 'omit' from source: magic vars 11124 1726882361.95497: variable 'omit' from source: magic vars 11124 1726882361.95587: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882361.95696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882361.95719: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882361.95739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882361.95785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882361.95818: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882361.95888: variable 'ansible_host' from source: host vars for 
'managed_node1' 11124 1726882361.95898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882361.96109: Set connection var ansible_shell_executable to /bin/sh 11124 1726882361.96124: Set connection var ansible_shell_type to sh 11124 1726882361.96137: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882361.96146: Set connection var ansible_timeout to 10 11124 1726882361.96156: Set connection var ansible_pipelining to False 11124 1726882361.96211: Set connection var ansible_connection to ssh 11124 1726882361.96242: variable 'ansible_shell_executable' from source: unknown 11124 1726882361.96250: variable 'ansible_connection' from source: unknown 11124 1726882361.96259: variable 'ansible_module_compression' from source: unknown 11124 1726882361.96268: variable 'ansible_shell_type' from source: unknown 11124 1726882361.96320: variable 'ansible_shell_executable' from source: unknown 11124 1726882361.96331: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882361.96339: variable 'ansible_pipelining' from source: unknown 11124 1726882361.96345: variable 'ansible_timeout' from source: unknown 11124 1726882361.96352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882361.96614: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882361.96751: variable 'omit' from source: magic vars 11124 1726882361.96765: starting attempt loop 11124 1726882361.96773: running the handler 11124 1726882361.96791: _low_level_execute_command(): starting 11124 1726882361.96802: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882361.99419: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 
1726882361.99445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882361.99467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.99489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.99535: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882361.99549: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882361.99572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882361.99594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882361.99606: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882361.99617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882361.99629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882361.99643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882361.99662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882361.99678: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882361.99694: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882361.99708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882361.99786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882361.99815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882361.99833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882361.99974: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.01642: stdout chunk (state=3): >>>/root <<< 11124 1726882362.01834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.01838: stdout chunk (state=3): >>><<< 11124 1726882362.01840: stderr chunk (state=3): >>><<< 11124 1726882362.01870: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882362.01965: _low_level_execute_command(): starting 11124 1726882362.01969: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331 `" && echo ansible-tmp-1726882362.018614-11227-196255082411331="` echo /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331 
`" ) && sleep 0' 11124 1726882362.03349: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.03495: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.03512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.03532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.03577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.03597: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882362.03613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.03631: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.03642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.03651: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.03661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.03675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.03688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.03699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.03714: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.03728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.03804: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.03944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 
1726882362.03962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.04092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.05982: stdout chunk (state=3): >>>ansible-tmp-1726882362.018614-11227-196255082411331=/root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331 <<< 11124 1726882362.06174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.06177: stdout chunk (state=3): >>><<< 11124 1726882362.06190: stderr chunk (state=3): >>><<< 11124 1726882362.06373: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882362.018614-11227-196255082411331=/root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882362.06376: variable 'ansible_module_compression' 
from source: unknown 11124 1726882362.06378: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11124 1726882362.06380: variable 'ansible_facts' from source: unknown 11124 1726882362.06530: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/AnsiballZ_setup.py 11124 1726882362.07192: Sending initial data 11124 1726882362.07195: Sent initial data (153 bytes) 11124 1726882362.09536: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.09553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.09572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.09641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.09687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.09700: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882362.09716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.09738: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.09770: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.09783: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.09795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.09808: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.09825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.09860: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.09884: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.09899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.10093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.10118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882362.10135: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.10260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.12262: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882362.12354: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882362.12452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpg_asay0s /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/AnsiballZ_setup.py <<< 11124 1726882362.12549: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882362.15723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.15859: stderr chunk (state=3): >>><<< 11124 1726882362.15863: stdout 
chunk (state=3): >>><<< 11124 1726882362.15872: done transferring module to remote 11124 1726882362.15875: _low_level_execute_command(): starting 11124 1726882362.15877: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/ /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/AnsiballZ_setup.py && sleep 0' 11124 1726882362.18046: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.18066: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.18082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.18107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.18151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.18168: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882362.18184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.18211: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.18224: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.18235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.18252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.18268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.18283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.18295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882362.18307: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.18325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.18409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.18429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882362.18452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.18675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11124 1726882362.20576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.20580: stdout chunk (state=3): >>><<< 11124 1726882362.20582: stderr chunk (state=3): >>><<< 11124 1726882362.20677: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: 
Received exit status from master 0 11124 1726882362.20680: _low_level_execute_command(): starting 11124 1726882362.20683: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/AnsiballZ_setup.py && sleep 0' 11124 1726882362.22045: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.22052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.22199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882362.22203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.22206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882362.22208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.22266: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.22406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882362.22409: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.22528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.24479: stdout chunk (state=3): 
>>>import _frozen_importlib # frozen import _imp # builtin <<< 11124 1726882362.24482: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11124 1726882362.24539: stdout chunk (state=3): >>>import '_io' # <<< 11124 1726882362.24542: stdout chunk (state=3): >>>import 'marshal' # <<< 11124 1726882362.24580: stdout chunk (state=3): >>>import 'posix' # <<< 11124 1726882362.24609: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 11124 1726882362.24612: stdout chunk (state=3): >>># installing zipimport hook <<< 11124 1726882362.24647: stdout chunk (state=3): >>>import 'time' # <<< 11124 1726882362.24652: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11124 1726882362.24704: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.24723: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 11124 1726882362.24751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 11124 1726882362.24767: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133b3dc0> <<< 11124 1726882362.24825: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 11124 1726882362.24833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 11124 1726882362.24838: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd1133b3b20> <<< 11124 1726882362.24876: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 11124 1726882362.24879: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133b3ac0> <<< 11124 1726882362.24895: stdout chunk (state=3): >>>import '_signal' # <<< 11124 1726882362.24915: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 11124 1726882362.24928: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358490> <<< 11124 1726882362.24982: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py<<< 11124 1726882362.24986: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 11124 1726882362.25006: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 11124 1726882362.25010: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358940> <<< 11124 1726882362.25021: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358670> <<< 11124 1726882362.25060: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 11124 1726882362.25068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 11124 
1726882362.25095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 11124 1726882362.25118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 11124 1726882362.25125: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 11124 1726882362.25139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 11124 1726882362.25172: stdout chunk (state=3): >>>import '_stat' # <<< 11124 1726882362.25175: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11330f190> <<< 11124 1726882362.25195: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 11124 1726882362.25211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 11124 1726882362.25282: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11330f220> <<< 11124 1726882362.25305: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 11124 1726882362.25341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113332850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11330f940> <<< 11124 1726882362.25370: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fd113370880> <<< 11124 1726882362.25393: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 11124 1726882362.25396: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113308d90> <<< 11124 1726882362.25460: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 11124 1726882362.25466: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113332d90> <<< 11124 1726882362.25510: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358970> <<< 11124 1726882362.25546: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11124 1726882362.25882: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 11124 1726882362.25912: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 11124 1726882362.25936: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 11124 1726882362.25980: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 11124 1726882362.25989: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 11124 1726882362.26006: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132aeeb0> <<< 11124 1726882362.26037: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132b1f40> <<< 11124 1726882362.26063: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 11124 1726882362.26075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 11124 1726882362.26100: stdout chunk (state=3): >>>import '_sre' # <<< 11124 1726882362.26149: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 11124 1726882362.26153: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py <<< 
11124 1726882362.26158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 11124 1726882362.26181: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132a7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132ad640> <<< 11124 1726882362.26187: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132ae370> <<< 11124 1726882362.26206: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 11124 1726882362.26278: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 11124 1726882362.26296: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 11124 1726882362.26336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.26408: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 11124 1726882362.26413: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd11322fe20> <<< 11124 1726882362.26416: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11322f910> import 'itertools' # <<< 11124 1726882362.26458: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11322ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 11124 1726882362.26877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # <<< 11124 1726882362.26882: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11322ffd0> <<< 11124 1726882362.26887: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 11124 1726882362.26889: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132420d0> <<< 11124 1726882362.26906: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113289d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113282670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132b5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed 
from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd113242cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1132952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132bb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242d60> <<< 11124 1726882362.26924: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 11124 
1726882362.26991: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 11124 1726882362.27103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112f723d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 11124 1726882362.27119: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112f724c0> <<< 11124 1726882362.27239: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113249f40> <<< 11124 1726882362.27274: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113244a90> <<< 11124 1726882362.27302: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113244490> <<< 11124 1726882362.27305: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 11124 1726882362.27323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 11124 1726882362.27430: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ea6220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112f5d520> <<< 11124 1726882362.27484: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113244f10> <<< 11124 1726882362.27491: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132bb040> <<< 11124 1726882362.27504: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 11124 1726882362.27521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 11124 1726882362.27542: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 11124 1726882362.27574: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112eb8b50> import 'errno' # <<< 11124 1726882362.27595: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112eb8e80> <<< 11124 1726882362.27621: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 11124 1726882362.27628: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 11124 1726882362.27652: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 11124 1726882362.27669: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ec9790> <<< 11124 1726882362.27680: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 11124 1726882362.27782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ec9cd0> <<< 11124 1726882362.27805: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e57400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112eb8f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 11124 1726882362.27889: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e682e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ec9610> import 'pwd' # <<< 11124 1726882362.27915: stdout chunk (state=3): >>># extension module 
'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e683a0> <<< 11124 1726882362.27943: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242a30> <<< 11124 1726882362.27983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 11124 1726882362.28019: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 11124 1726882362.28049: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e83700> <<< 11124 1726882362.28085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.28100: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e839d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e837c0> # extension module '_random' loaded from 
'/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.28110: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e838b0> <<< 11124 1726882362.28174: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 11124 1726882362.28329: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.28355: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e83d00> <<< 11124 1726882362.28378: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e8e250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e83940> <<< 11124 1726882362.28388: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e77a90> <<< 11124 1726882362.28416: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242610> <<< 11124 1726882362.28427: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 11124 1726882362.28493: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 11124 1726882362.28523: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e83af0> <<< 11124 1726882362.28668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 11124 1726882362.28679: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd112da76d0> <<< 11124 1726882362.28966: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip' # zipimport: zlib available <<< 11124 1726882362.29079: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.29097: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available <<< 11124 1726882362.29134: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 11124 1726882362.30329: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.31254: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 11124 1726882362.31265: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112703820> <<< 11124 1726882362.31268: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.31293: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 11124 1726882362.31332: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.31335: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112792730> <<< 11124 1726882362.31370: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792610> <<< 11124 1726882362.31407: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792340> <<< 11124 1726882362.31423: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 11124 1726882362.31478: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792460> <<< 11124 1726882362.31485: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792160> import 'atexit' # <<< 11124 1726882362.31511: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127923a0> <<< 11124 1726882362.31526: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 11124 1726882362.31538: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 11124 1726882362.31586: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792790> <<< 11124 1726882362.31604: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 11124 1726882362.31634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 11124 1726882362.31641: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 11124 1726882362.31660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 11124 1726882362.31683: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py <<< 11124 1726882362.31686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 11124 1726882362.31766: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112782820> <<< 11124 1726882362.31802: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112782490> <<< 11124 1726882362.31826: stdout chunk (state=3): >>># extension 
module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112782640> <<< 11124 1726882362.31850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 11124 1726882362.31866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 11124 1726882362.31885: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112688520> <<< 11124 1726882362.31894: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11278dd60> <<< 11124 1726882362.32077: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127924f0> <<< 11124 1726882362.32100: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 11124 1726882362.32111: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11278d1c0> <<< 11124 1726882362.32123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 11124 1726882362.32172: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 11124 1726882362.32198: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc 
matches /usr/lib64/python3.9/tokenize.py <<< 11124 1726882362.32208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 11124 1726882362.32228: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' <<< 11124 1726882362.32230: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112791b20> <<< 11124 1726882362.32314: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112761160> <<< 11124 1726882362.32321: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112761760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11268ed30> <<< 11124 1726882362.32345: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.32351: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112761670> <<< 11124 1726882362.32369: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127e3d00> <<< 11124 1726882362.32388: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 11124 1726882362.32398: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 11124 1726882362.32419: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 11124 1726882362.32458: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 11124 1726882362.32524: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.32542: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126e5a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127ede80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 11124 1726882362.32556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 11124 1726882362.32605: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.32619: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126f30a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127edeb0> <<< 11124 1726882362.32635: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 11124 1726882362.32669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.32688: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 11124 1726882362.32699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 11124 1726882362.32758: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127f5250> <<< 11124 1726882362.32883: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1126f30d0> <<< 11124 1726882362.32973: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127f5a60> <<< 11124 1726882362.33002: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127b7b80> <<< 11124 1726882362.33040: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127edcd0> <<< 11124 1726882362.33063: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127e3ee0> <<< 11124 
1726882362.33083: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 11124 1726882362.33103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 11124 1726882362.33150: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.33166: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126ef0d0> <<< 11124 1726882362.33326: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.33338: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126e6310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1126efcd0> <<< 11124 1726882362.33368: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126ef670> <<< 11124 1726882362.33388: stdout chunk (state=3): >>>import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd1126f0100> # zipimport: zlib available # zipimport: zlib available <<< 11124 1726882362.33412: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 11124 1726882362.33485: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.33567: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11124 1726882362.33588: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available <<< 11124 1726882362.33620: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 11124 1726882362.33631: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.33713: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.33809: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.34278: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.34707: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py <<< 11124 1726882362.34713: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 11124 1726882362.34733: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 11124 1726882362.34736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.34793: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd11272e910> <<< 11124 1726882362.34869: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 11124 1726882362.34872: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127339a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11228b640> <<< 11124 1726882362.34933: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 11124 1726882362.34938: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.34957: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/_text.py <<< 11124 1726882362.34971: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.35089: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.35224: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from 
'/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 11124 1726882362.35250: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127697f0> # zipimport: zlib available <<< 11124 1726882362.35634: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.35994: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36052: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36116: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/collections.py <<< 11124 1726882362.36120: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36144: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36181: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 11124 1726882362.36185: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36241: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36334: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/errors.py <<< 11124 1726882362.36337: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36340: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 11124 1726882362.36357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36390: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36435: stdout chunk (state=3): 
>>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 11124 1726882362.36438: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36607: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36802: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 11124 1726882362.36829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 11124 1726882362.36832: stdout chunk (state=3): >>>import '_ast' # <<< 11124 1726882362.36908: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127af460> <<< 11124 1726882362.36911: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.36962: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37044: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/validation.py <<< 11124 1726882362.37050: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 11124 1726882362.37072: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37086: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37135: stdout chunk (state=3): 
>>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/locale.py <<< 11124 1726882362.37139: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37172: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37207: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37307: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37354: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 11124 1726882362.37387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.37455: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.37459: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127220d0> <<< 11124 1726882362.37544: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127331f0> <<< 11124 1726882362.37580: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/process.py <<< 11124 1726882362.37583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37640: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11124 1726882362.37687: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37704: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.37769: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 11124 1726882362.38287: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112735bb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127fe070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127252e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available <<< 11124 1726882362.38358: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11124 1726882362.38384: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38409: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 11124 1726882362.38421: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38481: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38561: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38563: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38599: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 11124 1726882362.38744: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38883: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38914: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.38972: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py <<< 11124 1726882362.38979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882362.38993: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code 
object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 11124 1726882362.39018: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py <<< 11124 1726882362.39021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 11124 1726882362.39040: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11223f400> <<< 11124 1726882362.39076: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 11124 1726882362.39083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 11124 1726882362.39095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 11124 1726882362.39119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 11124 1726882362.39159: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 11124 1726882362.39163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' <<< 11124 1726882362.39167: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11229d9a0> <<< 11124 1726882362.39202: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.39205: stdout chunk (state=3): >>># extension module '_pickle' executed from 
'/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd11229ddf0> <<< 11124 1726882362.39267: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11229a490> <<< 11124 1726882362.39286: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112116040> <<< 11124 1726882362.39314: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120063a0> <<< 11124 1726882362.39320: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120065e0> <<< 11124 1726882362.39340: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 11124 1726882362.39349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 11124 1726882362.39370: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 11124 1726882362.39408: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127216d0> <<< 11124 1726882362.39431: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1122ac730> <<< 11124 1726882362.39438: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches 
/usr/lib64/python3.9/multiprocessing/util.py <<< 11124 1726882362.39452: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 11124 1726882362.39469: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127215e0> <<< 11124 1726882362.39500: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 11124 1726882362.39510: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 11124 1726882362.39546: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1122563a0> <<< 11124 1726882362.39579: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120659a0> <<< 11124 1726882362.39602: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120064f0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py <<< 11124 1726882362.39639: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39643: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 11124 1726882362.39656: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39709: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39771: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py <<< 11124 1726882362.39778: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39802: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39873: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 11124 1726882362.39894: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39897: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available <<< 11124 1726882362.39909: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39942: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py <<< 11124 1726882362.39945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.39988: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.40042: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 11124 
1726882362.40070: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.40104: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 11124 1726882362.40114: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.40390: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 11124 1726882362.40714: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41078: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py <<< 11124 1726882362.41081: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41118: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41173: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41188: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41229: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py <<< 11124 1726882362.41233: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: 
zlib available <<< 11124 1726882362.41259: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41287: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 11124 1726882362.41341: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41395: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 11124 1726882362.41398: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41417: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41453: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 11124 1726882362.41457: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41485: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41512: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 11124 1726882362.41515: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41660: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py <<< 11124 1726882362.41666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 11124 1726882362.41692: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7fd1120069d0> <<< 11124 1726882362.41697: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 11124 1726882362.41711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 11124 1726882362.41878: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111f84f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 11124 1726882362.41881: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41926: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.41991: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available <<< 11124 1726882362.42074: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42162: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 11124 1726882362.42173: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42201: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42273: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available <<< 11124 1726882362.42310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42369: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches 
/usr/lib64/python3.9/ssl.py <<< 11124 1726882362.42385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 11124 1726882362.42521: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.42524: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd111f7e3a0> <<< 11124 1726882362.42761: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111fcb100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py <<< 11124 1726882362.42767: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42805: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42865: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 11124 1726882362.42868: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.42928: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43002: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43092: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43234: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 11124 1726882362.43238: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43262: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43306: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py <<< 11124 1726882362.43310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43341: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 11124 1726882362.43397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 11124 1726882362.43472: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882362.43475: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd111f116a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111f11a90> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py <<< 11124 1726882362.43478: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11124 1726882362.43480: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py 
<<< 11124 1726882362.43490: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43520: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43562: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 11124 1726882362.43566: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43693: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43869: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 11124 1726882362.43906: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.43986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44060: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py <<< 11124 1726882362.44065: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available <<< 11124 1726882362.44142: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44157: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44276: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44438: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 11124 1726882362.44504: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44614: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 11124 1726882362.44620: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44636: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.44674: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.45105: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.45524: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 11124 1726882362.45528: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.45611: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.45703: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 11124 1726882362.45789: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.45884: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 11124 1726882362.45887: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46010: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46138: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available <<< 11124 1726882362.46167: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py <<< 11124 1726882362.46171: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46199: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46250: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 11124 1726882362.46253: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46334: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46414: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46593: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46780: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 11124 1726882362.46785: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 11124 1726882362.46813: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46842: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py <<< 11124 1726882362.46865: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46869: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46900: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py <<< 11124 1726882362.46903: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.46962: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47031: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 11124 1726882362.47034: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47058: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47074: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 11124 1726882362.47124: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47183: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 11124 1726882362.47187: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47226: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11124 1726882362.47284: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 11124 1726882362.47288: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47499: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47718: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 11124 1726882362.47721: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47763: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47814: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 11124 1726882362.47851: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47888: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 11124 1726882362.47895: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47914: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47951: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py <<< 11124 1726882362.47954: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.47983: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48022: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 11124 1726882362.48025: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48083: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48186: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 11124 1726882362.48189: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48191: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 11124 1726882362.48193: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48223: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48269: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py <<< 11124 1726882362.48295: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48298: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48308: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48359: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48392: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48456: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48522: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 11124 1726882362.48526: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 11124 1726882362.48537: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48573: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48619: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 11124 1726882362.48787: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48953: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 11124 1726882362.48956: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.48985: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49036: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py <<< 11124 1726882362.49039: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49075: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49123: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 11124 1726882362.49126: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49271: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 11124 1726882362.49278: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49337: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49413: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 11124 1726882362.49498: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882362.49660: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py <<< 11124 1726882362.49703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 11124 1726882362.49707: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py <<< 11124 1726882362.49718: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 11124 1726882362.49735: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd111ef35e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111ef50d0> <<< 11124 1726882362.49793: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111ef5f10> <<< 11124 1726882362.52283: stdout chunk (state=3): >>>import 'gc' # <<< 11124 1726882362.52819: stdout chunk (state=3): >>> <<< 11124 1726882362.52868: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", 
"ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "42", "epoch": "1726882362", "epoch_int": "1726882362", "date": "2024-09-20", 
"time": "21:32:42", "iso8601_micro": "2024-09-21T01:32:42.520599Z", "iso8601": "2024-09-21T01:32:42Z", "iso8601_basic": "20240920T213242520599", "iso8601_basic_short": "20240920T213242", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11124 1726882362.53543: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path <<< 11124 1726882362.53568: stdout chunk (state=3): >>># clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # 
cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] 
removing bz2 <<< 11124 1726882362.53582: stdout chunk (state=3): >>># cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime <<< 11124 1726882362.53620: stdout chunk (state=3): >>># cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing 
systemd._daemon # cleanup[2] removing systemd.daemon <<< 11124 1726882362.53661: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # 
destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser <<< 11124 1726882362.53752: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # 
cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing 
ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux <<< 11124 1726882362.53756: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # 
cleanup[2] removing gc <<< 11124 1726882362.53979: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11124 1726882362.53987: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 11124 1726882362.54011: stdout chunk (state=3): >>># destroy zipimport <<< 11124 1726882362.54026: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 11124 1726882362.54057: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 11124 1726882362.54068: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 11124 1726882362.54095: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 11124 1726882362.54136: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 11124 1726882362.54186: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 11124 1726882362.54216: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 11124 1726882362.54234: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex <<< 11124 1726882362.54249: stdout chunk (state=3): >>># destroy datetime <<< 11124 1726882362.54258: stdout chunk (state=3): >>># destroy base64 <<< 11124 1726882362.54284: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass <<< 11124 1726882362.54293: stdout chunk (state=3): >>># destroy json <<< 11124 1726882362.54316: stdout chunk (state=3): >>># 
destroy socket # destroy struct <<< 11124 1726882362.54324: stdout chunk (state=3): >>># destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 11124 1726882362.54378: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios <<< 11124 1726882362.54401: stdout chunk (state=3): >>># cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string<<< 11124 1726882362.54432: stdout chunk (state=3): >>> # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 11124 1726882362.54468: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch <<< 11124 1726882362.54491: stdout chunk (state=3): >>># cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping 
weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 11124 1726882362.54521: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types <<< 11124 1726882362.54542: stdout chunk (state=3): >>># cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 11124 1726882362.54583: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11124 1726882362.54621: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle <<< 11124 1726882362.54635: stdout chunk (state=3): >>># destroy systemd._daemon # 
destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 11124 1726882362.54827: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath <<< 11124 1726882362.54859: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 11124 1726882362.54882: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves <<< 11124 1726882362.54896: stdout chunk (state=3): >>># destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 11124 1726882362.54937: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 11124 1726882362.55313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882362.55316: stdout chunk (state=3): >>><<< 11124 1726882362.55325: stderr chunk (state=3): >>><<< 11124 1726882362.55501: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133b3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133b3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1133b3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358490> # 
/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11330f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11330f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113332850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11330f940> import 'os' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd113370880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113308d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113332d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113358970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132aeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132b1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # 
/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132a7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132ad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd11322fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11322f910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11322ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches 
/usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11322ffd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132420d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113289d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113282670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132b5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd113242cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1132952e0> import 
'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132bb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112f723d0> # 
/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112f724c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113249f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113244a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113244490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ea6220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112f5d520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113244f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1132bb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112eb8b50> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112eb8e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ec9790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ec9cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e57400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112eb8f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e682e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112ec9610> import 'pwd' # # extension module 'grp' 
loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e683a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e83700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e839d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e837c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e838b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc 
matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e83d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112e8e250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e83940> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e77a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd113242610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112e83af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fd112da76d0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112703820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112792730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127923a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112792790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112782820> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112782490> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112782640> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112688520> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd11278dd60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127924f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11278d1c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112791b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112761160> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112761760> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11268ed30> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd112761670> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' 
import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127e3d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126e5a00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127ede80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126f30a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127edeb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127f5250> import 
'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1126f30d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127f5a60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127b7b80> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127edcd0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127e3ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7fd1126ef0d0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126e6310> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1126efcd0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1126ef670> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1126f0100> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters 
# loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd11272e910> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127339a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11228b640> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127697f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127af460> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127220d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127331f0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fd112735bb0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127fe070> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127252e0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11223f400> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11229d9a0> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd11229ddf0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd11229a490> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd112116040> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120063a0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120065e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1127216d0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1122ac730> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1127215e0> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd1122563a0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fd1120659a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120064f0> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd1120069d0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111f84f40> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd111f7e3a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111fcb100> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd111f116a0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111f11a90> import 
ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # 
zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available 
import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip 
/tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_ucv7dcfu/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: 
zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fd111ef35e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111ef50d0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fd111ef5f10> import 'gc' # {"ansible_facts": {"ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "42", "epoch": "1726882362", "epoch_int": "1726882362", "date": "2024-09-20", 
"time": "21:32:42", "iso8601_micro": "2024-09-21T01:32:42.520599Z", "iso8601": "2024-09-21T01:32:42Z", "iso8601_basic": "20240920T213242520599", "iso8601_basic_short": "20240920T213242", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] 
removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy 
ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux 
# cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon 
# destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] 
wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # 
cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # 
cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] 
removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # 
destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # 
cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy 
sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 11124 1726882362.56419: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882362.56422: _low_level_execute_command(): starting 11124 1726882362.56425: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882362.018614-11227-196255082411331/ > /dev/null 2>&1 && sleep 0' 11124 1726882362.57102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.57119: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 11124 1726882362.57136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.57158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.57222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.57235: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882362.57258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.57289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.57307: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.57319: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.57332: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.57346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.57369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.57382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.57401: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.57415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.57609: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.57854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.57953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.60568: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.60655: stderr chunk (state=3): >>><<< 11124 1726882362.60659: stdout chunk (state=3): >>><<< 11124 1726882362.60684: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882362.60690: handler run complete 11124 1726882362.60756: variable 'ansible_facts' from source: unknown 11124 1726882362.60818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882362.60951: variable 'ansible_facts' from source: unknown 11124 1726882362.61027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882362.61083: attempt loop complete, returning result 11124 1726882362.61086: _execute() done 11124 1726882362.61089: dumping result to json 11124 
1726882362.61112: done dumping result, returning 11124 1726882362.61120: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [0e448fcc-3ce9-8362-0f62-0000000000de] 11124 1726882362.61125: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000de 11124 1726882362.61294: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000de 11124 1726882362.61296: WORKER PROCESS EXITING ok: [managed_node1] 11124 1726882362.61441: no more pending results, returning what we have 11124 1726882362.61445: results queue empty 11124 1726882362.61447: checking for any_errors_fatal 11124 1726882362.61448: done checking for any_errors_fatal 11124 1726882362.61449: checking for max_fail_percentage 11124 1726882362.61450: done checking for max_fail_percentage 11124 1726882362.61451: checking to see if all hosts have failed and the running result is not ok 11124 1726882362.61452: done checking to see if all hosts have failed 11124 1726882362.61453: getting the remaining hosts for this loop 11124 1726882362.61454: done getting the remaining hosts for this loop 11124 1726882362.61459: getting the next task for host managed_node1 11124 1726882362.61488: done getting next task for host managed_node1 11124 1726882362.61491: ^ task is: TASK: Check if system is ostree 11124 1726882362.61494: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882362.61498: getting variables 11124 1726882362.61501: in VariableManager get_vars() 11124 1726882362.61530: Calling all_inventory to load vars for managed_node1 11124 1726882362.61533: Calling groups_inventory to load vars for managed_node1 11124 1726882362.61537: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882362.61548: Calling all_plugins_play to load vars for managed_node1 11124 1726882362.61551: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882362.61554: Calling groups_plugins_play to load vars for managed_node1 11124 1726882362.61752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882362.62000: done with get_vars() 11124 1726882362.62034: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:32:42 -0400 (0:00:00.765) 0:00:02.868 ****** 11124 1726882362.62615: entering _queue_task() for managed_node1/stat 11124 1726882362.63206: worker is 1 (out of 1 available) 11124 1726882362.63216: exiting _queue_task() for managed_node1/stat 11124 1726882362.63227: done queuing things up, now waiting for results queue to drain 11124 1726882362.63228: waiting for pending results... 
11124 1726882362.63508: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 11124 1726882362.63624: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000e0 11124 1726882362.63644: variable 'ansible_search_path' from source: unknown 11124 1726882362.63656: variable 'ansible_search_path' from source: unknown 11124 1726882362.63699: calling self._execute() 11124 1726882362.63769: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882362.63783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882362.63794: variable 'omit' from source: magic vars 11124 1726882362.64292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882362.64582: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882362.64630: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882362.64679: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882362.64717: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882362.64810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882362.64842: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882362.64883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882362.64916: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882362.65045: Evaluated conditional (not __network_is_ostree is defined): True 11124 1726882362.65065: variable 'omit' from source: magic vars 11124 1726882362.65120: variable 'omit' from source: magic vars 11124 1726882362.65168: variable 'omit' from source: magic vars 11124 1726882362.65212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882362.65244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882362.65289: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882362.65316: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882362.65331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882362.65377: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882362.65391: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882362.65398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882362.65532: Set connection var ansible_shell_executable to /bin/sh 11124 1726882362.65545: Set connection var ansible_shell_type to sh 11124 1726882362.65559: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882362.65570: Set connection var ansible_timeout to 10 11124 1726882362.65578: Set connection var ansible_pipelining to False 11124 1726882362.65584: Set connection var ansible_connection to ssh 11124 1726882362.65613: variable 'ansible_shell_executable' from source: unknown 11124 1726882362.65624: variable 'ansible_connection' from 
source: unknown 11124 1726882362.65631: variable 'ansible_module_compression' from source: unknown 11124 1726882362.65637: variable 'ansible_shell_type' from source: unknown 11124 1726882362.65642: variable 'ansible_shell_executable' from source: unknown 11124 1726882362.65649: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882362.65656: variable 'ansible_pipelining' from source: unknown 11124 1726882362.65662: variable 'ansible_timeout' from source: unknown 11124 1726882362.65681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882362.65850: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882362.65867: variable 'omit' from source: magic vars 11124 1726882362.65876: starting attempt loop 11124 1726882362.65882: running the handler 11124 1726882362.65908: _low_level_execute_command(): starting 11124 1726882362.65921: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882362.66953: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.66975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.66996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.67016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.67060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.67082: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882362.67099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 
1726882362.67124: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.67138: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.67153: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.67171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.67192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.67207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.67223: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.67232: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.67243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.67328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.67343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882362.67358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.67511: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.69450: stdout chunk (state=3): >>>/root <<< 11124 1726882362.69686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.69690: stdout chunk (state=3): >>><<< 11124 1726882362.69694: stderr chunk (state=3): >>><<< 11124 1726882362.69771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882362.69783: _low_level_execute_command(): starting 11124 1726882362.69786: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092 `" && echo ansible-tmp-1726882362.6972234-11259-205991503740092="` echo /root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092 `" ) && sleep 0' 11124 1726882362.70474: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.70489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.70512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.70531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.70578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.70591: stderr chunk (state=3): >>>debug2: match not found <<< 11124 
1726882362.70614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.70633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.70645: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.70662: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.70678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.70692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.70709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.70729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.70742: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.70759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.70843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.70870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882362.70887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.71076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.73444: stdout chunk (state=3): >>>ansible-tmp-1726882362.6972234-11259-205991503740092=/root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092 <<< 11124 1726882362.73679: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.73682: stdout chunk (state=3): >>><<< 11124 1726882362.73684: stderr chunk (state=3): >>><<< 11124 1726882362.73771: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882362.6972234-11259-205991503740092=/root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882362.73775: variable 'ansible_module_compression' from source: unknown 11124 1726882362.73874: ANSIBALLZ: Using lock for stat 11124 1726882362.73877: ANSIBALLZ: Acquiring lock 11124 1726882362.73880: ANSIBALLZ: Lock acquired: 139628948389760 11124 1726882362.73882: ANSIBALLZ: Creating module 11124 1726882362.87352: ANSIBALLZ: Writing module into payload 11124 1726882362.87742: ANSIBALLZ: Writing module 11124 1726882362.87813: ANSIBALLZ: Renaming module 11124 1726882362.87833: ANSIBALLZ: Done creating module 11124 1726882362.87860: variable 'ansible_facts' from source: unknown 11124 1726882362.87946: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/AnsiballZ_stat.py 11124 1726882362.88301: Sending initial data 11124 1726882362.88303: Sent initial data (153 bytes) 11124 1726882362.89380: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.89393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.89406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.89424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.89479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.89493: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882362.89508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.89527: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882362.89541: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882362.89556: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882362.89578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882362.89599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.89615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.89627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882362.89639: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882362.89656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.89742: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.89769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882362.89787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.89935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882362.92528: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882362.92622: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882362.92713: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp3omseyr3 /root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/AnsiballZ_stat.py <<< 11124 1726882362.92804: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882362.94182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.94372: stderr chunk (state=3): >>><<< 11124 1726882362.94376: stdout chunk (state=3): >>><<< 11124 1726882362.94378: done transferring module to remote 11124 1726882362.94468: _low_level_execute_command(): starting 11124 1726882362.94471: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/ /root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/AnsiballZ_stat.py && sleep 0' 11124 1726882362.95692: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882362.95696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882362.95727: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.95731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882362.95733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882362.95807: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882362.95811: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882362.95928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11124 1726882362.98503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882362.98576: stderr chunk (state=3): >>><<< 11124 1726882362.98579: stdout chunk (state=3): >>><<< 11124 1726882362.98677: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11124 1726882362.98682: _low_level_execute_command(): starting 11124 1726882362.98684: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/AnsiballZ_stat.py && sleep 0' 11124 1726882363.00761: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882363.00896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.00910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.00924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.00966: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882363.00978: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882363.00985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.01007: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882363.01016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882363.01023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882363.01030: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.01040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.01054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.01063: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.01077: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882363.01086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.01273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882363.01292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882363.01304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882363.01442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882363.03383: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 11124 1726882363.03395: stdout chunk (state=3): >>>import '_thread' # <<< 11124 1726882363.03403: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 11124 1726882363.03474: stdout chunk (state=3): >>>import '_io' # <<< 11124 1726882363.03482: stdout chunk 
(state=3): >>>import 'marshal' # <<< 11124 1726882363.03518: stdout chunk (state=3): >>>import 'posix' # <<< 11124 1726882363.03548: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 11124 1726882363.03555: stdout chunk (state=3): >>># installing zipimport hook <<< 11124 1726882363.03594: stdout chunk (state=3): >>>import 'time' # <<< 11124 1726882363.03608: stdout chunk (state=3): >>>import 'zipimport' # <<< 11124 1726882363.03612: stdout chunk (state=3): >>># installed zipimport hook <<< 11124 1726882363.03666: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py <<< 11124 1726882363.03674: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.03698: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 11124 1726882363.03722: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 11124 1726882363.03725: stdout chunk (state=3): >>>import '_codecs' # <<< 11124 1726882363.03761: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403b3dc0> <<< 11124 1726882363.03805: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 11124 1726882363.03831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 11124 1726882363.03835: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403583a0> <<< 11124 1726882363.03841: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403b3b20> <<< 11124 1726882363.03860: stdout chunk (state=3): 
>>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py <<< 11124 1726882363.03870: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 11124 1726882363.03883: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403b3ac0> <<< 11124 1726882363.03913: stdout chunk (state=3): >>>import '_signal' # <<< 11124 1726882363.03928: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 11124 1726882363.03943: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 11124 1726882363.03958: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358490> <<< 11124 1726882363.03985: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 11124 1726882363.03994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 11124 1726882363.04013: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 11124 1726882363.04022: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 11124 1726882363.04043: stdout chunk (state=3): >>>import '_abc' # <<< 11124 1726882363.04054: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358940><<< 11124 1726882363.04065: stdout chunk (state=3): >>> <<< 11124 1726882363.04084: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358670><<< 11124 1726882363.04090: stdout chunk (state=3): >>> <<< 11124 1726882363.04120: stdout chunk 
(state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 11124 1726882363.04136: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 11124 1726882363.04161: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 11124 1726882363.04188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 11124 1726882363.04202: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 11124 1726882363.04227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 11124 1726882363.04253: stdout chunk (state=3): >>>import '_stat' # <<< 11124 1726882363.04264: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884030f190> <<< 11124 1726882363.04279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 11124 1726882363.04294: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 11124 1726882363.04377: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884030f220> <<< 11124 1726882363.04407: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 11124 1726882363.04415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 11124 1726882363.04432: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' 
import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840332850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884030f940> <<< 11124 1726882363.04481: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840370880> <<< 11124 1726882363.04489: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' <<< 11124 1726882363.04503: stdout chunk (state=3): >>>import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840307d90> <<< 11124 1726882363.04840: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840332d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 11124 1726882363.05076: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py <<< 11124 1726882363.05081: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 11124 1726882363.05084: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 11124 1726882363.05088: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 11124 1726882363.05092: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 11124 1726882363.05095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 11124 1726882363.05100: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402aeeb0> <<< 11124 1726882363.05102: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402b1f40> <<< 11124 1726882363.05104: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 11124 1726882363.05106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 11124 1726882363.05111: stdout chunk (state=3): >>>import '_sre' # <<< 11124 1726882363.05115: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 11124 1726882363.05117: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 11124 1726882363.05376: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402a7610> <<< 11124 1726882363.05383: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402ad640> <<< 11124 1726882363.05386: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 11124 1726882363.05390: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 11124 1726882363.05392: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 11124 1726882363.05397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.05400: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 11124 1726882363.05403: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f884022fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884022f910> <<< 11124 1726882363.05405: stdout chunk (state=3): >>>import 'itertools' # <<< 11124 1726882363.05407: stdout 
chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884022ff10> <<< 11124 1726882363.05408: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 11124 1726882363.05432: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 11124 1726882363.05450: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884022ffd0> <<< 11124 1726882363.05487: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' <<< 11124 1726882363.05495: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402420d0> import '_collections' # <<< 11124 1726882363.05570: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840289d90> import '_functools' # <<< 11124 1726882363.05576: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840282670> <<< 11124 1726882363.05620: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 11124 1726882363.05652: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402956d0> <<< 11124 1726882363.05658: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402b5e20> # 
/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 11124 1726882363.05689: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8840242cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402892b0> <<< 11124 1726882363.05733: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882363.05739: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88402952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402bb9d0> <<< 11124 1726882363.05762: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 11124 1726882363.05775: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 11124 1726882363.05789: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.05822: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 11124 1726882363.05840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' 
import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242eb0> <<< 11124 1726882363.05845: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242df0> <<< 11124 1726882363.05875: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242d60> <<< 11124 1726882363.05904: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 11124 1726882363.05959: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 11124 1726882363.05997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 11124 1726882363.06027: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fffa3d0> <<< 11124 1726882363.06044: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 11124 1726882363.06055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 11124 
1726882363.06090: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fffa4c0> <<< 11124 1726882363.06210: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840249f40> <<< 11124 1726882363.06253: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840244a90> <<< 11124 1726882363.06259: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840244490> <<< 11124 1726882363.06288: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 11124 1726882363.06330: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 11124 1726882363.06335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 11124 1726882363.06367: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 11124 1726882363.06375: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff23220> <<< 11124 1726882363.06404: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ffe5520> <<< 11124 1726882363.06453: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840244f10> <<< 11124 1726882363.06462: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402bb040> <<< 11124 1726882363.06479: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 11124 1726882363.06517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 11124 1726882363.06531: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 11124 1726882363.06541: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff35b50> import 'errno' # <<< 11124 1726882363.06586: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff35e80> <<< 11124 1726882363.06598: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 11124 1726882363.06632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 11124 1726882363.06638: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff46790> <<< 11124 1726882363.06661: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 11124 1726882363.06690: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 11124 1726882363.06724: stdout chunk (state=3): >>>import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f883ff46cd0> <<< 11124 1726882363.06758: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fed4400> <<< 11124 1726882363.06769: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff35f70> <<< 11124 1726882363.06789: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 11124 1726882363.06798: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 11124 1726882363.06846: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fee52e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff46610> <<< 11124 1726882363.06851: stdout chunk (state=3): >>>import 'pwd' # <<< 11124 1726882363.06880: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fee53a0> <<< 11124 1726882363.06922: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242a30> <<< 11124 1726882363.06934: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc 
matches /usr/lib64/python3.9/tempfile.py <<< 11124 1726882363.06958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 11124 1726882363.06976: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 11124 1726882363.06990: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 11124 1726882363.07019: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff00700> <<< 11124 1726882363.07042: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 11124 1726882363.07077: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff009d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff007c0> <<< 11124 1726882363.07097: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff008b0> <<< 11124 1726882363.07126: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 11124 1726882363.07318: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff00d00> <<< 11124 1726882363.07358: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff0b250> <<< 11124 1726882363.07366: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff00940> <<< 11124 1726882363.07378: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fef4a90> <<< 11124 1726882363.07399: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242610> <<< 11124 1726882363.07502: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 11124 1726882363.07509: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 11124 1726882363.07861: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff00af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f883fe1b6d0> # zipimport: found 30 names in 
'/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip' # zipimport: zlib available <<< 11124 1726882363.07922: stdout chunk (state=3): >>># zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 11124 1726882363.07951: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/__init__.py <<< 11124 1726882363.08377: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.09185: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.10123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7ed820> <<< 11124 1726882363.10154: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.10171: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 11124 1726882363.10197: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 11124 1726882363.10222: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fda8730> <<< 11124 1726882363.10264: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8610> <<< 11124 1726882363.10291: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8340> <<< 11124 1726882363.10311: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 11124 1726882363.10364: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8160> <<< 11124 1726882363.10371: stdout chunk (state=3): >>>import 'atexit' # <<< 11124 1726882363.10394: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fda83a0> <<< 11124 1726882363.10412: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 11124 1726882363.10443: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 11124 1726882363.10481: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8790> <<< 11124 1726882363.10498: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 11124 1726882363.10509: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 11124 1726882363.10530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 11124 1726882363.10552: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 11124 1726882363.10578: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 11124 1726882363.10646: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f76d7f0> <<< 11124 1726882363.10682: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f76db80> <<< 11124 1726882363.10715: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f76d9d0> <<< 11124 1726882363.10726: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 11124 1726882363.10761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 11124 1726882363.10793: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f883f78caf0> <<< 11124 1726882363.10806: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda2d60> <<< 11124 1726882363.10974: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda84f0> <<< 11124 1726882363.10994: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 11124 1726882363.11017: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda21c0> <<< 11124 1726882363.11040: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 11124 1726882363.11050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 11124 1726882363.11068: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 11124 1726882363.11096: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py <<< 11124 1726882363.11108: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 11124 1726882363.11128: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7e9b20> <<< 11124 1726882363.11227: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd4beb0> import 'linecache' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f883fd4b8b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7862e0> <<< 11124 1726882363.11256: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fd4b9a0> <<< 11124 1726882363.11286: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd79d00> <<< 11124 1726882363.11305: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 11124 1726882363.11312: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 11124 1726882363.11333: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 11124 1726882363.11360: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 11124 1726882363.11438: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f74ea00> import 'datetime' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f883fd81e80> <<< 11124 1726882363.11453: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 11124 1726882363.11467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 11124 1726882363.11524: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' <<< 11124 1726882363.11534: stdout chunk (state=3): >>>import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f75d0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd81eb0> <<< 11124 1726882363.11545: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 11124 1726882363.11584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.11610: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 11124 1726882363.11617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 11124 1726882363.11677: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd4e730> <<< 11124 1726882363.11803: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f75d0d0> <<< 11124 1726882363.11892: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed 
from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f75a550> <<< 11124 1726882363.11922: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f75a610> <<< 11124 1726882363.11963: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f759c40> <<< 11124 1726882363.11981: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd79ee0> <<< 11124 1726882363.11997: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 11124 1726882363.12006: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 11124 1726882363.12018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 11124 1726882363.12076: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f7deb50> <<< 11124 1726882363.12262: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f7dc940> <<< 11124 1726882363.12282: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f750820> <<< 11124 1726882363.12310: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f7de5b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd42af0> <<< 11124 1726882363.12321: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11124 1726882363.12326: stdout chunk (state=3): >>>import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py <<< 11124 1726882363.12338: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.12419: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.12495: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.12510: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip 
/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 11124 1726882363.12531: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11124 1726882363.12541: stdout chunk (state=3): >>>import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available <<< 11124 1726882363.12647: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.12744: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.13203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.13662: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py <<< 11124 1726882363.13685: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 11124 1726882363.13696: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 11124 1726882363.13703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.13761: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f32edf0> <<< 11124 1726882363.13832: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f72a5b0> <<< 11124 1726882363.13843: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f71bdf0> <<< 11124 1726882363.13894: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py <<< 11124 1726882363.13906: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.13915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.13933: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 11124 1726882363.14061: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.14193: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 11124 1726882363.14213: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7d39d0> <<< 11124 1726882363.14221: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.14662: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.14978: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15029: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15096: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 11124 1726882363.15130: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15156: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py <<< 11124 1726882363.15169: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15227: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15300: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 11124 1726882363.15323: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py <<< 11124 1726882363.15331: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15365: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15404: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 11124 1726882363.15410: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15597: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15784: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 11124 1726882363.15814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' <<< 11124 1726882363.15819: stdout chunk (state=3): >>>import '_ast' # <<< 11124 1726882363.15890: stdout 
chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f301e50> <<< 11124 1726882363.15896: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.15955: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16024: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py <<< 11124 1726882363.16033: stdout chunk (state=3): >>>import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py <<< 11124 1726882363.16045: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16082: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16121: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/locale.py <<< 11124 1726882363.16127: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16162: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16200: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16294: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16353: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 11124 1726882363.16375: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 11124 1726882363.16450: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fd93910> <<< 11124 1726882363.16476: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f301be0> <<< 11124 1726882363.16511: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/process.py <<< 11124 1726882363.16516: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16637: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16686: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16710: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.16745: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 11124 1726882363.16759: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 11124 1726882363.16773: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 11124 1726882363.16812: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 11124 1726882363.16826: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 11124 1726882363.16843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 11124 1726882363.16933: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f2c3c70> <<< 11124 1726882363.16972: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f71d670> <<< 11124 1726882363.17024: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f71c850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py <<< 11124 1726882363.17045: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.17063: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.17084: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py <<< 11124 1726882363.17090: stdout chunk (state=3): >>>import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 11124 1726882363.17155: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/basic.py <<< 11124 1726882363.17175: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.17186: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip 
/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 11124 1726882363.17192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.17306: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.17476: stdout chunk (state=3): >>># zipimport: zlib available <<< 11124 1726882363.17632: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 11124 1726882363.17877: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 <<< 11124 1726882363.17927: stdout chunk (state=3): >>># clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] 
removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg <<< 11124 1726882363.17970: stdout chunk (state=3): >>># cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil <<< 11124 1726882363.18019: 
stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache <<< 11124 1726882363.18051: stdout chunk (state=3): >>># cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing 
ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text<<< 11124 1726882363.18093: stdout chunk (state=3): >>> # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast <<< 11124 1726882363.18102: stdout chunk (state=3): >>># destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # 
destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11124 1726882363.18322: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11124 1726882363.18352: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 11124 1726882363.18355: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma <<< 11124 1726882363.18397: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 11124 1726882363.18410: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy array # destroy datetime <<< 11124 1726882363.18426: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 11124 1726882363.18463: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux 
<<< 11124 1726882363.18491: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 11124 1726882363.18518: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 <<< 11124 1726882363.18540: stdout chunk (state=3): >>># cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings <<< 11124 1726882363.18563: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools <<< 11124 1726882363.18601: stdout chunk (state=3): >>># cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping 
_operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath <<< 11124 1726882363.18619: stdout chunk (state=3): >>># cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal <<< 11124 1726882363.18636: stdout chunk (state=3): >>># cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader <<< 11124 1726882363.18642: stdout chunk (state=3): >>># destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 11124 1726882363.18812: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq <<< 11124 1726882363.18834: stdout chunk (state=3): >>># destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 11124 1726882363.18841: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 11124 1726882363.18867: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 11124 1726882363.19265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882363.19268: stderr chunk (state=3): >>><<< 11124 1726882363.19271: stdout chunk (state=3): >>><<< 11124 1726882363.19361: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403b3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403b3b20> # 
/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88403b3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884030f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f884030f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840332850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884030f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840370880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840307d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840332d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840358970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402aeeb0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402b1f40> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402a7610> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402ad640> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402ae370> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f884022fe20> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884022f910> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884022ff10> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f884022ffd0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402420d0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840289d90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840282670> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f88402956d0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402b5e20> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8840242cd0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402892b0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f88402952e0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402bb9d0> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242eb0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242df0> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242d60> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fffa3d0> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fffa4c0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840249f40> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840244a90> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840244490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from 
'/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff23220> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ffe5520> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840244f10> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f88402bb040> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff35b50> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff35e80> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff46790> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' 
import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff46cd0> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fed4400> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff35f70> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fee52e0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff46610> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fee53a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242a30> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff00700> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff009d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff007c0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff008b0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff00d00> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883ff0b250> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff00940> import 
'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fef4a90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8840242610> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883ff00af0> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f883fe1b6d0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7ed820> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fda8730> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8610> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8340> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8460> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8160> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fda83a0> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda8790> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f883f76d7f0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f76db80> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f76d9d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f78caf0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda2d60> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda84f0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fda21c0> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from 
'/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7e9b20> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd4beb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd4b8b0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7862e0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fd4b9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd79d00> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f74ea00> 
import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd81e80> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f75d0a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd81eb0> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd4e730> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f75d0d0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f75a550> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f75a610> # extension module 
'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f759c40> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883fd79ee0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f7deb50> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f7dc940> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f750820> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f7de5b0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f883fd42af0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883f32edf0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from 
'/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f72a5b0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f71bdf0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f7d39d0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip 
/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f301e50> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' 
import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f883fd93910> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f301be0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f2c3c70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f71d670> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f883f71c850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip 
/tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_0fwxx3m0/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # 
cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # 
destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # 
cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ [... interpreter shutdown output identical to the stdout junk above ...] # clear sys.audit hooks 11124 1726882363.20614: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882363.20617: _low_level_execute_command(): starting 11124 1726882363.20620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882362.6972234-11259-205991503740092/ > /dev/null 2>&1 && sleep 0' 11124 1726882363.22480: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882363.22604: stderr chunk
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.22614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.22628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.22668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.22675: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882363.22686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.22700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882363.22711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882363.22717: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882363.22726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.22734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.22747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.22752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.22759: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882363.22770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.22955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882363.22976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882363.22990: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882363.23110: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882363.24986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882363.24990: stdout chunk (state=3): >>><<< 11124 1726882363.24996: stderr chunk (state=3): >>><<< 11124 1726882363.25015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882363.25022: handler run complete 11124 1726882363.25044: attempt loop complete, returning result 11124 1726882363.25050: _execute() done 11124 1726882363.25053: dumping result to json 11124 1726882363.25055: done dumping result, returning 11124 1726882363.25061: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0e448fcc-3ce9-8362-0f62-0000000000e0] 11124 1726882363.25067: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e0 11124 1726882363.25162: done 
sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e0 11124 1726882363.25167: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11124 1726882363.25223: no more pending results, returning what we have 11124 1726882363.25226: results queue empty 11124 1726882363.25227: checking for any_errors_fatal 11124 1726882363.25233: done checking for any_errors_fatal 11124 1726882363.25234: checking for max_fail_percentage 11124 1726882363.25235: done checking for max_fail_percentage 11124 1726882363.25236: checking to see if all hosts have failed and the running result is not ok 11124 1726882363.25237: done checking to see if all hosts have failed 11124 1726882363.25238: getting the remaining hosts for this loop 11124 1726882363.25239: done getting the remaining hosts for this loop 11124 1726882363.25242: getting the next task for host managed_node1 11124 1726882363.25250: done getting next task for host managed_node1 11124 1726882363.25253: ^ task is: TASK: Set flag to indicate system is ostree 11124 1726882363.25256: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882363.25259: getting variables 11124 1726882363.25261: in VariableManager get_vars() 11124 1726882363.25295: Calling all_inventory to load vars for managed_node1 11124 1726882363.25297: Calling groups_inventory to load vars for managed_node1 11124 1726882363.25301: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882363.25311: Calling all_plugins_play to load vars for managed_node1 11124 1726882363.25314: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882363.25316: Calling groups_plugins_play to load vars for managed_node1 11124 1726882363.25484: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882363.25728: done with get_vars() 11124 1726882363.25828: done getting variables 11124 1726882363.26008: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:32:43 -0400 (0:00:00.634) 0:00:03.503 ****** 11124 1726882363.26082: entering _queue_task() for managed_node1/set_fact 11124 1726882363.26084: Creating lock for set_fact 11124 1726882363.27158: worker is 1 (out of 1 available) 11124 1726882363.27170: exiting _queue_task() for managed_node1/set_fact 11124 1726882363.27180: done queuing things up, now waiting for results queue to drain 11124 1726882363.27182: waiting for pending results... 
11124 1726882363.27792: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 11124 1726882363.27878: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000e1 11124 1726882363.27892: variable 'ansible_search_path' from source: unknown 11124 1726882363.27895: variable 'ansible_search_path' from source: unknown 11124 1726882363.27932: calling self._execute() 11124 1726882363.28002: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882363.28007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882363.28017: variable 'omit' from source: magic vars 11124 1726882363.29216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882363.29661: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882363.29702: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882363.29852: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882363.29882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882363.30073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882363.30169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882363.30204: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882363.30235: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882363.30584: Evaluated conditional (not __network_is_ostree is defined): True 11124 1726882363.30597: variable 'omit' from source: magic vars 11124 1726882363.30640: variable 'omit' from source: magic vars 11124 1726882363.30866: variable '__ostree_booted_stat' from source: set_fact 11124 1726882363.30949: variable 'omit' from source: magic vars 11124 1726882363.31032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882363.31097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882363.31191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882363.31214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882363.31235: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882363.31299: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882363.31338: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882363.31347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882363.31567: Set connection var ansible_shell_executable to /bin/sh 11124 1726882363.31581: Set connection var ansible_shell_type to sh 11124 1726882363.31592: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882363.31668: Set connection var ansible_timeout to 10 11124 1726882363.31680: Set connection var ansible_pipelining to False 11124 1726882363.31687: Set connection var ansible_connection to ssh 11124 1726882363.31715: variable 'ansible_shell_executable' 
from source: unknown 11124 1726882363.31724: variable 'ansible_connection' from source: unknown 11124 1726882363.31732: variable 'ansible_module_compression' from source: unknown 11124 1726882363.31772: variable 'ansible_shell_type' from source: unknown 11124 1726882363.31781: variable 'ansible_shell_executable' from source: unknown 11124 1726882363.31788: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882363.31796: variable 'ansible_pipelining' from source: unknown 11124 1726882363.31803: variable 'ansible_timeout' from source: unknown 11124 1726882363.31878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882363.31976: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882363.32081: variable 'omit' from source: magic vars 11124 1726882363.32093: starting attempt loop 11124 1726882363.32099: running the handler 11124 1726882363.32113: handler run complete 11124 1726882363.32125: attempt loop complete, returning result 11124 1726882363.32130: _execute() done 11124 1726882363.32135: dumping result to json 11124 1726882363.32206: done dumping result, returning 11124 1726882363.32219: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0e448fcc-3ce9-8362-0f62-0000000000e1] 11124 1726882363.32228: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e1 ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11124 1726882363.32383: no more pending results, returning what we have 11124 1726882363.32386: results queue empty 11124 1726882363.32387: checking for any_errors_fatal 11124 1726882363.32395: done checking for any_errors_fatal 11124 
1726882363.32396: checking for max_fail_percentage 11124 1726882363.32397: done checking for max_fail_percentage 11124 1726882363.32398: checking to see if all hosts have failed and the running result is not ok 11124 1726882363.32399: done checking to see if all hosts have failed 11124 1726882363.32400: getting the remaining hosts for this loop 11124 1726882363.32401: done getting the remaining hosts for this loop 11124 1726882363.32406: getting the next task for host managed_node1 11124 1726882363.32414: done getting next task for host managed_node1 11124 1726882363.32416: ^ task is: TASK: Fix CentOS6 Base repo 11124 1726882363.32419: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882363.32423: getting variables 11124 1726882363.32425: in VariableManager get_vars() 11124 1726882363.32457: Calling all_inventory to load vars for managed_node1 11124 1726882363.32459: Calling groups_inventory to load vars for managed_node1 11124 1726882363.32462: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882363.32471: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e1 11124 1726882363.32475: WORKER PROCESS EXITING 11124 1726882363.32486: Calling all_plugins_play to load vars for managed_node1 11124 1726882363.32489: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882363.32498: Calling groups_plugins_play to load vars for managed_node1 11124 1726882363.32709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882363.32912: done with get_vars() 11124 1726882363.32922: done getting variables 11124 1726882363.33262: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 21:32:43 -0400 (0:00:00.072) 0:00:03.575 ****** 11124 1726882363.33291: entering _queue_task() for managed_node1/copy 11124 1726882363.34173: worker is 1 (out of 1 available) 11124 1726882363.34186: exiting _queue_task() for managed_node1/copy 11124 1726882363.34397: done queuing things up, now waiting for results queue to drain 11124 1726882363.34399: waiting for pending results... 
11124 1726882363.34734: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 11124 1726882363.34955: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000e3 11124 1726882363.34979: variable 'ansible_search_path' from source: unknown 11124 1726882363.34987: variable 'ansible_search_path' from source: unknown 11124 1726882363.35028: calling self._execute() 11124 1726882363.35107: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882363.35266: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882363.35280: variable 'omit' from source: magic vars 11124 1726882363.36960: variable 'ansible_distribution' from source: facts 11124 1726882363.36996: Evaluated conditional (ansible_distribution == 'CentOS'): True 11124 1726882363.37128: variable 'ansible_distribution_major_version' from source: facts 11124 1726882363.37142: Evaluated conditional (ansible_distribution_major_version == '6'): False 11124 1726882363.37150: when evaluation is False, skipping this task 11124 1726882363.37157: _execute() done 11124 1726882363.37167: dumping result to json 11124 1726882363.37175: done dumping result, returning 11124 1726882363.37186: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0e448fcc-3ce9-8362-0f62-0000000000e3] 11124 1726882363.37200: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e3 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11124 1726882363.37389: no more pending results, returning what we have 11124 1726882363.37392: results queue empty 11124 1726882363.37393: checking for any_errors_fatal 11124 1726882363.37398: done checking for any_errors_fatal 11124 1726882363.37399: checking for max_fail_percentage 11124 1726882363.37400: done checking for max_fail_percentage 11124 1726882363.37401: checking to see if all hosts have failed and the 
running result is not ok 11124 1726882363.37402: done checking to see if all hosts have failed 11124 1726882363.37403: getting the remaining hosts for this loop 11124 1726882363.37404: done getting the remaining hosts for this loop 11124 1726882363.37407: getting the next task for host managed_node1 11124 1726882363.37415: done getting next task for host managed_node1 11124 1726882363.37417: ^ task is: TASK: Include the task 'enable_epel.yml' 11124 1726882363.37421: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882363.37425: getting variables 11124 1726882363.37427: in VariableManager get_vars() 11124 1726882363.37467: Calling all_inventory to load vars for managed_node1 11124 1726882363.37470: Calling groups_inventory to load vars for managed_node1 11124 1726882363.37474: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882363.37489: Calling all_plugins_play to load vars for managed_node1 11124 1726882363.37491: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882363.37495: Calling groups_plugins_play to load vars for managed_node1 11124 1726882363.37670: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e3 11124 1726882363.37674: WORKER PROCESS EXITING 11124 1726882363.37695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882363.37905: done with get_vars() 11124 1726882363.37917: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 21:32:43 -0400 (0:00:00.048) 0:00:03.623 ****** 11124 1726882363.38128: entering _queue_task() for managed_node1/include_tasks 11124 1726882363.38914: worker is 1 (out of 1 available) 11124 1726882363.38926: exiting _queue_task() for managed_node1/include_tasks 11124 1726882363.39160: done queuing things up, now waiting for results queue to drain 11124 1726882363.39162: waiting for pending results... 
11124 1726882363.39335: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 11124 1726882363.39444: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000e4 11124 1726882363.39468: variable 'ansible_search_path' from source: unknown 11124 1726882363.39475: variable 'ansible_search_path' from source: unknown 11124 1726882363.39521: calling self._execute() 11124 1726882363.39605: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882363.39616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882363.39627: variable 'omit' from source: magic vars 11124 1726882363.40140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882363.43426: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882363.43505: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882363.43552: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882363.43592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882363.43619: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882363.43705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882363.43738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882363.43781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882363.43827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882363.43873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882363.44015: variable '__network_is_ostree' from source: set_fact 11124 1726882363.44065: Evaluated conditional (not __network_is_ostree | d(false)): True 11124 1726882363.44079: _execute() done 11124 1726882363.44092: dumping result to json 11124 1726882363.44100: done dumping result, returning 11124 1726882363.44109: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0e448fcc-3ce9-8362-0f62-0000000000e4] 11124 1726882363.44122: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e4 11124 1726882363.45044: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000e4 11124 1726882363.45048: WORKER PROCESS EXITING 11124 1726882363.45050: no more pending results, returning what we have 11124 1726882363.45055: in VariableManager get_vars() 11124 1726882363.45095: Calling all_inventory to load vars for managed_node1 11124 1726882363.45099: Calling groups_inventory to load vars for managed_node1 11124 1726882363.45103: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882363.45113: Calling all_plugins_play to load vars for managed_node1 11124 1726882363.45116: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882363.45119: Calling groups_plugins_play to load vars for managed_node1 11124 1726882363.45309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 11124 1726882363.45559: done with get_vars() 11124 1726882363.45569: variable 'ansible_search_path' from source: unknown 11124 1726882363.45571: variable 'ansible_search_path' from source: unknown 11124 1726882363.45612: we have included files to process 11124 1726882363.45613: generating all_blocks data 11124 1726882363.45615: done generating all_blocks data 11124 1726882363.45620: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11124 1726882363.45622: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11124 1726882363.45624: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11124 1726882363.47898: done processing included file 11124 1726882363.47901: iterating over new_blocks loaded from include file 11124 1726882363.47902: in VariableManager get_vars() 11124 1726882363.47916: done with get_vars() 11124 1726882363.47917: filtering new block on tags 11124 1726882363.47943: done filtering new block on tags 11124 1726882363.47946: in VariableManager get_vars() 11124 1726882363.47957: done with get_vars() 11124 1726882363.47959: filtering new block on tags 11124 1726882363.47974: done filtering new block on tags 11124 1726882363.47976: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 11124 1726882363.47982: extending task lists for all hosts with included blocks 11124 1726882363.48216: done extending task lists 11124 1726882363.48218: done processing included files 11124 1726882363.48219: results queue empty 11124 1726882363.48220: checking for any_errors_fatal 11124 1726882363.48223: done checking for any_errors_fatal 11124 1726882363.48224: checking for max_fail_percentage 11124 1726882363.48225: done 
checking for max_fail_percentage 11124 1726882363.48226: checking to see if all hosts have failed and the running result is not ok 11124 1726882363.48226: done checking to see if all hosts have failed 11124 1726882363.48227: getting the remaining hosts for this loop 11124 1726882363.48228: done getting the remaining hosts for this loop 11124 1726882363.48231: getting the next task for host managed_node1 11124 1726882363.48235: done getting next task for host managed_node1 11124 1726882363.48237: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11124 1726882363.48240: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882363.48242: getting variables 11124 1726882363.48243: in VariableManager get_vars() 11124 1726882363.48252: Calling all_inventory to load vars for managed_node1 11124 1726882363.48254: Calling groups_inventory to load vars for managed_node1 11124 1726882363.48257: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882363.48265: Calling all_plugins_play to load vars for managed_node1 11124 1726882363.48273: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882363.48276: Calling groups_plugins_play to load vars for managed_node1 11124 1726882363.48436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882363.48657: done with get_vars() 11124 1726882363.48668: done getting variables 11124 1726882363.48738: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11124 1726882363.48971: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 21:32:43 -0400 (0:00:00.108) 0:00:03.732 ****** 11124 1726882363.49019: entering _queue_task() for managed_node1/command 11124 1726882363.49021: Creating lock for command 11124 1726882363.49367: worker is 1 (out of 1 available) 11124 1726882363.49383: exiting _queue_task() for managed_node1/command 11124 1726882363.49399: done queuing things up, now waiting for results queue to drain 11124 1726882363.49401: waiting for pending results... 
11124 1726882363.50901: running TaskExecutor() for managed_node1/TASK: Create EPEL 9 11124 1726882363.51138: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000fe 11124 1726882363.51159: variable 'ansible_search_path' from source: unknown 11124 1726882363.51208: variable 'ansible_search_path' from source: unknown 11124 1726882363.51287: calling self._execute() 11124 1726882363.51542: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882363.51556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882363.51592: variable 'omit' from source: magic vars 11124 1726882363.52320: variable 'ansible_distribution' from source: facts 11124 1726882363.52383: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11124 1726882363.52627: variable 'ansible_distribution_major_version' from source: facts 11124 1726882363.52672: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11124 1726882363.52680: when evaluation is False, skipping this task 11124 1726882363.52687: _execute() done 11124 1726882363.52693: dumping result to json 11124 1726882363.52700: done dumping result, returning 11124 1726882363.52778: done running TaskExecutor() for managed_node1/TASK: Create EPEL 9 [0e448fcc-3ce9-8362-0f62-0000000000fe] 11124 1726882363.52789: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000fe skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11124 1726882363.52954: no more pending results, returning what we have 11124 1726882363.52958: results queue empty 11124 1726882363.52959: checking for any_errors_fatal 11124 1726882363.52960: done checking for any_errors_fatal 11124 1726882363.52960: checking for max_fail_percentage 11124 1726882363.52962: done checking for max_fail_percentage 11124 1726882363.52962: checking to see if all hosts have failed and 
the running result is not ok 11124 1726882363.52965: done checking to see if all hosts have failed 11124 1726882363.52966: getting the remaining hosts for this loop 11124 1726882363.52967: done getting the remaining hosts for this loop 11124 1726882363.52971: getting the next task for host managed_node1 11124 1726882363.52976: done getting next task for host managed_node1 11124 1726882363.52979: ^ task is: TASK: Install yum-utils package 11124 1726882363.52984: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882363.52987: getting variables 11124 1726882363.52989: in VariableManager get_vars() 11124 1726882363.53017: Calling all_inventory to load vars for managed_node1 11124 1726882363.53019: Calling groups_inventory to load vars for managed_node1 11124 1726882363.53023: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882363.53037: Calling all_plugins_play to load vars for managed_node1 11124 1726882363.53040: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882363.53043: Calling groups_plugins_play to load vars for managed_node1 11124 1726882363.53205: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000fe 11124 1726882363.53209: WORKER PROCESS EXITING 11124 1726882363.53223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882363.53423: done with get_vars() 11124 1726882363.53432: done getting variables 11124 1726882363.53525: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 21:32:43 -0400 (0:00:00.046) 0:00:03.779 ****** 11124 1726882363.53670: entering _queue_task() for managed_node1/package 11124 1726882363.53672: Creating lock for package 11124 1726882363.54134: worker is 1 (out of 1 available) 11124 1726882363.54146: exiting _queue_task() for managed_node1/package 11124 1726882363.54157: done queuing things up, now waiting for results queue to drain 11124 1726882363.54159: waiting for pending results... 
11124 1726882363.55002: running TaskExecutor() for managed_node1/TASK: Install yum-utils package
11124 1726882363.55162: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000ff
11124 1726882363.55369: variable 'ansible_search_path' from source: unknown
11124 1726882363.55464: variable 'ansible_search_path' from source: unknown
11124 1726882363.55506: calling self._execute()
11124 1726882363.55616: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.55679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.55711: variable 'omit' from source: magic vars
11124 1726882363.56286: variable 'ansible_distribution' from source: facts
11124 1726882363.56303: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11124 1726882363.56431: variable 'ansible_distribution_major_version' from source: facts
11124 1726882363.56444: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11124 1726882363.56458: when evaluation is False, skipping this task
11124 1726882363.56466: _execute() done
11124 1726882363.56474: dumping result to json
11124 1726882363.56480: done dumping result, returning
11124 1726882363.56490: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0e448fcc-3ce9-8362-0f62-0000000000ff]
11124 1726882363.56501: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000ff
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11124 1726882363.56646: no more pending results, returning what we have
11124 1726882363.56649: results queue empty
11124 1726882363.56651: checking for any_errors_fatal
11124 1726882363.56656: done checking for any_errors_fatal
11124 1726882363.56657: checking for max_fail_percentage
11124 1726882363.56659: done checking for max_fail_percentage
11124 1726882363.56659: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.56661: done checking to see if all hosts have failed
11124 1726882363.56661: getting the remaining hosts for this loop
11124 1726882363.56662: done getting the remaining hosts for this loop
11124 1726882363.56670: getting the next task for host managed_node1
11124 1726882363.56676: done getting next task for host managed_node1
11124 1726882363.56678: ^ task is: TASK: Enable EPEL 7
11124 1726882363.56682: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.56685: getting variables
11124 1726882363.56687: in VariableManager get_vars()
11124 1726882363.56761: Calling all_inventory to load vars for managed_node1
11124 1726882363.56766: Calling groups_inventory to load vars for managed_node1
11124 1726882363.56770: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.56781: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.56784: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.56787: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.56934: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000ff
11124 1726882363.56938: WORKER PROCESS EXITING
11124 1726882363.56951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.57140: done with get_vars()
11124 1726882363.57150: done getting variables
11124 1726882363.57207: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 21:32:43 -0400 (0:00:00.035) 0:00:03.814 ******
11124 1726882363.57236: entering _queue_task() for managed_node1/command
11124 1726882363.57978: worker is 1 (out of 1 available)
11124 1726882363.57991: exiting _queue_task() for managed_node1/command
11124 1726882363.58005: done queuing things up, now waiting for results queue to drain
11124 1726882363.58006: waiting for pending results...
11124 1726882363.58690: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7
11124 1726882363.58807: in run() - task 0e448fcc-3ce9-8362-0f62-000000000100
11124 1726882363.58827: variable 'ansible_search_path' from source: unknown
11124 1726882363.58834: variable 'ansible_search_path' from source: unknown
11124 1726882363.58884: calling self._execute()
11124 1726882363.58962: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.58976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.58991: variable 'omit' from source: magic vars
11124 1726882363.59535: variable 'ansible_distribution' from source: facts
11124 1726882363.59683: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11124 1726882363.59906: variable 'ansible_distribution_major_version' from source: facts
11124 1726882363.59916: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11124 1726882363.59922: when evaluation is False, skipping this task
11124 1726882363.59929: _execute() done
11124 1726882363.59934: dumping result to json
11124 1726882363.59941: done dumping result, returning
11124 1726882363.60003: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0e448fcc-3ce9-8362-0f62-000000000100]
11124 1726882363.60014: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000100
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11124 1726882363.60159: no more pending results, returning what we have
11124 1726882363.60162: results queue empty
11124 1726882363.60164: checking for any_errors_fatal
11124 1726882363.60170: done checking for any_errors_fatal
11124 1726882363.60170: checking for max_fail_percentage
11124 1726882363.60172: done checking for max_fail_percentage
11124 1726882363.60173: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.60174: done checking to see if all hosts have failed
11124 1726882363.60174: getting the remaining hosts for this loop
11124 1726882363.60176: done getting the remaining hosts for this loop
11124 1726882363.60179: getting the next task for host managed_node1
11124 1726882363.60185: done getting next task for host managed_node1
11124 1726882363.60187: ^ task is: TASK: Enable EPEL 8
11124 1726882363.60192: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.60195: getting variables
11124 1726882363.60196: in VariableManager get_vars()
11124 1726882363.60224: Calling all_inventory to load vars for managed_node1
11124 1726882363.60227: Calling groups_inventory to load vars for managed_node1
11124 1726882363.60231: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.60248: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.60251: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.60256: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.60431: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000100
11124 1726882363.60438: WORKER PROCESS EXITING
11124 1726882363.60453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.60657: done with get_vars()
11124 1726882363.60669: done getting variables
11124 1726882363.60725: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 21:32:43 -0400 (0:00:00.035) 0:00:03.850 ******
11124 1726882363.60870: entering _queue_task() for managed_node1/command
11124 1726882363.61910: worker is 1 (out of 1 available)
11124 1726882363.61917: exiting _queue_task() for managed_node1/command
11124 1726882363.61925: done queuing things up, now waiting for results queue to drain
11124 1726882363.61927: waiting for pending results...
11124 1726882363.61945: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8
11124 1726882363.61953: in run() - task 0e448fcc-3ce9-8362-0f62-000000000101
11124 1726882363.61956: variable 'ansible_search_path' from source: unknown
11124 1726882363.61959: variable 'ansible_search_path' from source: unknown
11124 1726882363.61961: calling self._execute()
11124 1726882363.62100: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.62113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.62131: variable 'omit' from source: magic vars
11124 1726882363.62489: variable 'ansible_distribution' from source: facts
11124 1726882363.62505: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11124 1726882363.62653: variable 'ansible_distribution_major_version' from source: facts
11124 1726882363.62671: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11124 1726882363.62679: when evaluation is False, skipping this task
11124 1726882363.62685: _execute() done
11124 1726882363.62691: dumping result to json
11124 1726882363.62697: done dumping result, returning
11124 1726882363.62705: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0e448fcc-3ce9-8362-0f62-000000000101]
11124 1726882363.62714: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000101
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11124 1726882363.62863: no more pending results, returning what we have
11124 1726882363.62878: results queue empty
11124 1726882363.62879: checking for any_errors_fatal
11124 1726882363.62887: done checking for any_errors_fatal
11124 1726882363.62888: checking for max_fail_percentage
11124 1726882363.62890: done checking for max_fail_percentage
11124 1726882363.62890: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.62892: done checking to see if all hosts have failed
11124 1726882363.62892: getting the remaining hosts for this loop
11124 1726882363.62894: done getting the remaining hosts for this loop
11124 1726882363.62897: getting the next task for host managed_node1
11124 1726882363.62906: done getting next task for host managed_node1
11124 1726882363.62909: ^ task is: TASK: Enable EPEL 6
11124 1726882363.62913: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.62919: getting variables
11124 1726882363.62921: in VariableManager get_vars()
11124 1726882363.62995: Calling all_inventory to load vars for managed_node1
11124 1726882363.62998: Calling groups_inventory to load vars for managed_node1
11124 1726882363.63003: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.63016: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.63020: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.63023: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.63220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.63442: done with get_vars()
11124 1726882363.63454: done getting variables
11124 1726882363.63613: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000101
11124 1726882363.63616: WORKER PROCESS EXITING
11124 1726882363.63772: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 21:32:43 -0400 (0:00:00.030) 0:00:03.880 ******
11124 1726882363.63807: entering _queue_task() for managed_node1/copy
11124 1726882363.64169: worker is 1 (out of 1 available)
11124 1726882363.64181: exiting _queue_task() for managed_node1/copy
11124 1726882363.64196: done queuing things up, now waiting for results queue to drain
11124 1726882363.64198: waiting for pending results...
11124 1726882363.64441: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6
11124 1726882363.64545: in run() - task 0e448fcc-3ce9-8362-0f62-000000000103
11124 1726882363.64567: variable 'ansible_search_path' from source: unknown
11124 1726882363.64575: variable 'ansible_search_path' from source: unknown
11124 1726882363.64612: calling self._execute()
11124 1726882363.64696: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.64707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.64719: variable 'omit' from source: magic vars
11124 1726882363.65084: variable 'ansible_distribution' from source: facts
11124 1726882363.65104: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11124 1726882363.65227: variable 'ansible_distribution_major_version' from source: facts
11124 1726882363.65237: Evaluated conditional (ansible_distribution_major_version == '6'): False
11124 1726882363.65244: when evaluation is False, skipping this task
11124 1726882363.65254: _execute() done
11124 1726882363.65260: dumping result to json
11124 1726882363.65270: done dumping result, returning
11124 1726882363.65284: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0e448fcc-3ce9-8362-0f62-000000000103]
11124 1726882363.65298: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000103
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
11124 1726882363.65438: no more pending results, returning what we have
11124 1726882363.65442: results queue empty
11124 1726882363.65443: checking for any_errors_fatal
11124 1726882363.65450: done checking for any_errors_fatal
11124 1726882363.65451: checking for max_fail_percentage
11124 1726882363.65453: done checking for max_fail_percentage
11124 1726882363.65454: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.65455: done checking to see if all hosts have failed
11124 1726882363.65456: getting the remaining hosts for this loop
11124 1726882363.65457: done getting the remaining hosts for this loop
11124 1726882363.65461: getting the next task for host managed_node1
11124 1726882363.65473: done getting next task for host managed_node1
11124 1726882363.65475: ^ task is: TASK: Set network provider to 'nm'
11124 1726882363.65478: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.65483: getting variables
11124 1726882363.65485: in VariableManager get_vars()
11124 1726882363.65513: Calling all_inventory to load vars for managed_node1
11124 1726882363.65516: Calling groups_inventory to load vars for managed_node1
11124 1726882363.65519: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.65531: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.65533: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.65536: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.65711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.65920: done with get_vars()
11124 1726882363.65930: done getting variables
11124 1726882363.66008: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13
Friday 20 September 2024 21:32:43 -0400 (0:00:00.022) 0:00:03.902 ******
11124 1726882363.66042: entering _queue_task() for managed_node1/set_fact
11124 1726882363.66066: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000103
11124 1726882363.66076: WORKER PROCESS EXITING
11124 1726882363.66468: worker is 1 (out of 1 available)
11124 1726882363.66481: exiting _queue_task() for managed_node1/set_fact
11124 1726882363.66493: done queuing things up, now waiting for results queue to drain
11124 1726882363.66495: waiting for pending results...
11124 1726882363.67389: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm'
11124 1726882363.67533: in run() - task 0e448fcc-3ce9-8362-0f62-000000000007
11124 1726882363.67584: variable 'ansible_search_path' from source: unknown
11124 1726882363.67637: calling self._execute()
11124 1726882363.67799: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.67815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.67834: variable 'omit' from source: magic vars
11124 1726882363.67950: variable 'omit' from source: magic vars
11124 1726882363.67987: variable 'omit' from source: magic vars
11124 1726882363.68023: variable 'omit' from source: magic vars
11124 1726882363.68074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11124 1726882363.68112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11124 1726882363.68136: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11124 1726882363.68165: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882363.68182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882363.68211: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11124 1726882363.68219: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.68226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.68335: Set connection var ansible_shell_executable to /bin/sh
11124 1726882363.68352: Set connection var ansible_shell_type to sh
11124 1726882363.68367: Set connection var ansible_module_compression to ZIP_DEFLATED
11124 1726882363.68383: Set connection var ansible_timeout to 10
11124 1726882363.68392: Set connection var ansible_pipelining to False
11124 1726882363.68398: Set connection var ansible_connection to ssh
11124 1726882363.68421: variable 'ansible_shell_executable' from source: unknown
11124 1726882363.68429: variable 'ansible_connection' from source: unknown
11124 1726882363.68436: variable 'ansible_module_compression' from source: unknown
11124 1726882363.68442: variable 'ansible_shell_type' from source: unknown
11124 1726882363.68451: variable 'ansible_shell_executable' from source: unknown
11124 1726882363.68458: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.68468: variable 'ansible_pipelining' from source: unknown
11124 1726882363.68475: variable 'ansible_timeout' from source: unknown
11124 1726882363.68489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.68635: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11124 1726882363.68655: variable 'omit' from source: magic vars
11124 1726882363.68668: starting attempt loop
11124 1726882363.68676: running the handler
11124 1726882363.68691: handler run complete
11124 1726882363.68714: attempt loop complete, returning result
11124 1726882363.68721: _execute() done
11124 1726882363.68727: dumping result to json
11124 1726882363.68735: done dumping result, returning
11124 1726882363.68745: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0e448fcc-3ce9-8362-0f62-000000000007]
11124 1726882363.68757: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000007
ok: [managed_node1] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
11124 1726882363.68958: no more pending results, returning what we have
11124 1726882363.68960: results queue empty
11124 1726882363.68962: checking for any_errors_fatal
11124 1726882363.68966: done checking for any_errors_fatal
11124 1726882363.68967: checking for max_fail_percentage
11124 1726882363.68969: done checking for max_fail_percentage
11124 1726882363.68970: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.68971: done checking to see if all hosts have failed
11124 1726882363.68972: getting the remaining hosts for this loop
11124 1726882363.68973: done getting the remaining hosts for this loop
11124 1726882363.68977: getting the next task for host managed_node1
11124 1726882363.68984: done getting next task for host managed_node1
11124 1726882363.68986: ^ task is: TASK: meta (flush_handlers)
11124 1726882363.68988: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.68992: getting variables
11124 1726882363.68994: in VariableManager get_vars()
11124 1726882363.69018: Calling all_inventory to load vars for managed_node1
11124 1726882363.69021: Calling groups_inventory to load vars for managed_node1
11124 1726882363.69024: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.69035: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.69038: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.69040: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.69190: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.69369: done with get_vars()
11124 1726882363.69382: done getting variables
11124 1726882363.69444: in VariableManager get_vars()
11124 1726882363.69456: Calling all_inventory to load vars for managed_node1
11124 1726882363.69459: Calling groups_inventory to load vars for managed_node1
11124 1726882363.69462: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.69468: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.69470: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.69473: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.70885: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000007
11124 1726882363.70888: WORKER PROCESS EXITING
11124 1726882363.71077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.71470: done with get_vars()
11124 1726882363.71486: done queuing things up, now waiting for results queue to drain
11124 1726882363.71488: results queue empty
11124 1726882363.71489: checking for any_errors_fatal
11124 1726882363.71491: done checking for any_errors_fatal
11124 1726882363.71492: checking for max_fail_percentage
11124 1726882363.71493: done checking for max_fail_percentage
11124 1726882363.71494: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.71495: done checking to see if all hosts have failed
11124 1726882363.71495: getting the remaining hosts for this loop
11124 1726882363.71497: done getting the remaining hosts for this loop
11124 1726882363.71499: getting the next task for host managed_node1
11124 1726882363.71503: done getting next task for host managed_node1
11124 1726882363.71505: ^ task is: TASK: meta (flush_handlers)
11124 1726882363.71506: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.71515: getting variables
11124 1726882363.71516: in VariableManager get_vars()
11124 1726882363.71524: Calling all_inventory to load vars for managed_node1
11124 1726882363.71527: Calling groups_inventory to load vars for managed_node1
11124 1726882363.71529: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.71534: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.71653: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.71657: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.72095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.72284: done with get_vars()
11124 1726882363.72510: done getting variables
11124 1726882363.72585: in VariableManager get_vars()
11124 1726882363.72593: Calling all_inventory to load vars for managed_node1
11124 1726882363.72596: Calling groups_inventory to load vars for managed_node1
11124 1726882363.72598: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.72602: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.72605: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.72608: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.72891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.73397: done with get_vars()
11124 1726882363.73408: done queuing things up, now waiting for results queue to drain
11124 1726882363.73410: results queue empty
11124 1726882363.73410: checking for any_errors_fatal
11124 1726882363.73412: done checking for any_errors_fatal
11124 1726882363.73412: checking for max_fail_percentage
11124 1726882363.73413: done checking for max_fail_percentage
11124 1726882363.73414: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.73415: done checking to see if all hosts have failed
11124 1726882363.73415: getting the remaining hosts for this loop
11124 1726882363.73416: done getting the remaining hosts for this loop
11124 1726882363.73428: getting the next task for host managed_node1
11124 1726882363.73432: done getting next task for host managed_node1
11124 1726882363.73432: ^ task is: None
11124 1726882363.73440: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.73442: done queuing things up, now waiting for results queue to drain
11124 1726882363.73450: results queue empty
11124 1726882363.73451: checking for any_errors_fatal
11124 1726882363.73452: done checking for any_errors_fatal
11124 1726882363.73452: checking for max_fail_percentage
11124 1726882363.73454: done checking for max_fail_percentage
11124 1726882363.73454: checking to see if all hosts have failed and the running result is not ok
11124 1726882363.73455: done checking to see if all hosts have failed
11124 1726882363.73458: getting the next task for host managed_node1
11124 1726882363.73461: done getting next task for host managed_node1
11124 1726882363.73462: ^ task is: None
11124 1726882363.73464: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.73570: in VariableManager get_vars()
11124 1726882363.73599: done with get_vars()
11124 1726882363.73605: in VariableManager get_vars()
11124 1726882363.73622: done with get_vars()
11124 1726882363.73626: variable 'omit' from source: magic vars
11124 1726882363.73666: in VariableManager get_vars()
11124 1726882363.73683: done with get_vars()
11124 1726882363.73711: variable 'omit' from source: magic vars

PLAY [Play for testing bond device using deprecated 'master' argument] *********
11124 1726882363.75300: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
11124 1726882363.75505: getting the remaining hosts for this loop
11124 1726882363.75507: done getting the remaining hosts for this loop
11124 1726882363.75514: getting the next task for host managed_node1
11124 1726882363.75517: done getting next task for host managed_node1
11124 1726882363.75519: ^ task is: TASK: Gathering Facts
11124 1726882363.75520: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882363.75522: getting variables
11124 1726882363.75523: in VariableManager get_vars()
11124 1726882363.75535: Calling all_inventory to load vars for managed_node1
11124 1726882363.75538: Calling groups_inventory to load vars for managed_node1
11124 1726882363.75539: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882363.75551: Calling all_plugins_play to load vars for managed_node1
11124 1726882363.75566: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882363.75570: Calling groups_plugins_play to load vars for managed_node1
11124 1726882363.75714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882363.75939: done with get_vars()
11124 1726882363.75946: done getting variables
11124 1726882363.75994: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
Friday 20 September 2024 21:32:43 -0400 (0:00:00.099) 0:00:04.002 ******
11124 1726882363.76016: entering _queue_task() for managed_node1/gather_facts
11124 1726882363.76378: worker is 1 (out of 1 available)
11124 1726882363.76388: exiting _queue_task() for managed_node1/gather_facts
11124 1726882363.76400: done queuing things up, now waiting for results queue to drain
11124 1726882363.76401: waiting for pending results...
11124 1726882363.76709: running TaskExecutor() for managed_node1/TASK: Gathering Facts
11124 1726882363.76819: in run() - task 0e448fcc-3ce9-8362-0f62-000000000129
11124 1726882363.76844: variable 'ansible_search_path' from source: unknown
11124 1726882363.76897: calling self._execute()
11124 1726882363.76991: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.77002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.77015: variable 'omit' from source: magic vars
11124 1726882363.77644: variable 'ansible_distribution_major_version' from source: facts
11124 1726882363.77668: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882363.77680: variable 'omit' from source: magic vars
11124 1726882363.77710: variable 'omit' from source: magic vars
11124 1726882363.77758: variable 'omit' from source: magic vars
11124 1726882363.77804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11124 1726882363.77842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11124 1726882363.77881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11124 1726882363.77904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882363.77921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882363.77957: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11124 1726882363.77975: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.77985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.78208: Set connection var ansible_shell_executable to /bin/sh
11124 1726882363.78222: Set connection var ansible_shell_type to sh
11124 1726882363.78235: Set connection var ansible_module_compression to ZIP_DEFLATED
11124 1726882363.78302: Set connection var ansible_timeout to 10
11124 1726882363.78313: Set connection var ansible_pipelining to False
11124 1726882363.78320: Set connection var ansible_connection to ssh
11124 1726882363.78345: variable 'ansible_shell_executable' from source: unknown
11124 1726882363.78414: variable 'ansible_connection' from source: unknown
11124 1726882363.78423: variable 'ansible_module_compression' from source: unknown
11124 1726882363.78431: variable 'ansible_shell_type' from source: unknown
11124 1726882363.78438: variable 'ansible_shell_executable' from source: unknown
11124 1726882363.78445: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882363.78457: variable 'ansible_pipelining' from source: unknown
11124 1726882363.78466: variable 'ansible_timeout' from source: unknown
11124 1726882363.78479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882363.78899: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11124 1726882363.78965: variable 'omit' from source: magic vars
11124 1726882363.78978: starting attempt loop
11124 1726882363.79066: running the handler
11124 1726882363.79087: variable 'ansible_facts' from source: unknown
11124 1726882363.79112: _low_level_execute_command(): starting
11124 1726882363.79126: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11124 1726882363.80189: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882363.80208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124
1726882363.80230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.80253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.80353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.80378: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882363.80391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.80407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882363.80416: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882363.80424: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882363.80434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.80445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.80463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.80476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.80492: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882363.80508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.80612: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882363.80633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882363.80660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882363.80793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11124 
1726882363.82690: stdout chunk (state=3): >>>/root <<< 11124 1726882363.82872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882363.82875: stdout chunk (state=3): >>><<< 11124 1726882363.82878: stderr chunk (state=3): >>><<< 11124 1726882363.83000: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11124 1726882363.83004: _low_level_execute_command(): starting 11124 1726882363.83008: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887 `" && echo ansible-tmp-1726882363.8291066-11318-226688755348887="` echo /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887 `" ) && sleep 0' 11124 1726882363.85124: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 11124 1726882363.85139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.85158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.85180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.85222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.85234: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882363.85251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.85271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882363.85285: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882363.85296: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882363.85309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.85323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.85338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.85354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.85368: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882363.85383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.85461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882363.85486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882363.85503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 11124 1726882363.85630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882363.87480: stdout chunk (state=3): >>>ansible-tmp-1726882363.8291066-11318-226688755348887=/root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887 <<< 11124 1726882363.87669: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882363.87673: stdout chunk (state=3): >>><<< 11124 1726882363.87686: stderr chunk (state=3): >>><<< 11124 1726882363.87870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882363.8291066-11318-226688755348887=/root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882363.87874: variable 'ansible_module_compression' from source: unknown 11124 1726882363.87877: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11124 1726882363.87880: variable 'ansible_facts' from source: unknown 11124 1726882363.88014: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/AnsiballZ_setup.py 11124 1726882363.89084: Sending initial data 11124 1726882363.89094: Sent initial data (154 bytes) 11124 1726882363.92254: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882363.92297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.92315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.92338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.92424: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882363.92568: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882363.92583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.92607: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882363.92620: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882363.92632: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882363.92644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882363.92665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882363.92709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882363.92724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882363.92737: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882363.92754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882363.92854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882363.92906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882363.92923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882363.93052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882363.94794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882363.94897: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882363.94993: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp8lpd1ci6 /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/AnsiballZ_setup.py <<< 11124 1726882363.95085: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882363.98070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882363.98155: stderr chunk (state=3): >>><<< 11124 1726882363.98158: stdout chunk (state=3): >>><<< 11124 1726882363.98188: done transferring 
module to remote 11124 1726882363.98197: _low_level_execute_command(): starting 11124 1726882363.98203: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/ /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/AnsiballZ_setup.py && sleep 0' 11124 1726882364.00172: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882364.00284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882364.00306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882364.00325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882364.00372: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882364.00478: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882364.00494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882364.00519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882364.00534: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882364.00547: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882364.00565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882364.00581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882364.00597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882364.00614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882364.00631: stderr chunk (state=3): >>>debug2: match found <<< 11124 
1726882364.00652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882364.00730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882364.00859: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882364.00878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882364.01072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882364.02874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882364.02878: stdout chunk (state=3): >>><<< 11124 1726882364.02880: stderr chunk (state=3): >>><<< 11124 1726882364.02984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882364.02987: 
_low_level_execute_command(): starting 11124 1726882364.02990: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/AnsiballZ_setup.py && sleep 0' 11124 1726882364.04534: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882364.04546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882364.04561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882364.04632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882364.04677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882364.04687: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882364.04698: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882364.04713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882364.04731: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882364.04740: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882364.04843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882364.04860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882364.04878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882364.04888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882364.04897: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882364.04908: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882364.04994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882364.05009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882364.05067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882364.05201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882364.69240: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, 
"ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-<<< 11124 1726882364.69255: stdout chunk (state=3): >>>8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2797, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 735, "free": 2797}, "nocache": {"free": 3248, "used": 284}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", 
"ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_secon<<< 11124 1726882364.69276: stdout chunk (state=3): >>>ds": 522, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264240234496, "block_size": 4096, "block_total": 65519355, "block_available": 64511776, "block_used": 1007579, "inode_total": 131071472, "inode_available": 130998718, "inode_used": 72754, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": 
"02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on",<<< 11124 1726882364.69718: stdout chunk (state=3): >>> "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", 
"tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", 
"type": "targeted"}, "ansible_loadavg": {"1m": 0.58, "5m": 0.32, "15m": 0.15}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "44", "epoch": "1726882364", "epoch_int": "1726882364", "date": "2024-09-20", "time": "21:32:44", "iso8601_micro": "2024-09-21T01:32:44.686613Z", "iso8601": "2024-09-21T01:32:44Z", "iso8601_basic": "20240920T213244686613", "iso8601_basic_short": "20240920T213244", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11124 1726882364.71796: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882364.71800: stdout chunk (state=3): >>><<< 11124 1726882364.71803: stderr chunk (state=3): >>><<< 11124 1726882364.71976: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-508.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 15:49:37 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-44-90", "ansible_nodename": "ip-10-31-44-90.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "465293f2bd9b457497a5eaf565f184f8", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-508.el9.x86_64", "root": "UUID=6c640f10-8261-4074-b9b8-2cdc3ddcc013", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBANd7RrvRqB/kTBmp4g3vOXSd1XQX7zSKmnuTmpsZ60vmB0PwXJIj7HTI9DBqGrzofXOOHlh/Qs4ml+D2H79EO+6EUEZza9meZ+anJVVwXBuxgfn1Hl/EExsmp7gV5o+r0rCFeuFds6ISUZQpal2OlQk3lrit00RA//JoxvQO1YQdAAAAFQDxLJGSeERY5sZYyGr88om1BNq9KQAAAIEA1WO5ElHMof6Upf9GQZn4wlJh6JUOjZfYnL4XATh/W6uye7kuC/rBqGPirkmks1GCUkKhSkzQlRIPyLYENrbPKRMNarYAwwQ8N8BDOUWFDCcrO55SJdlLTyYGWdlVFysYGSMIyZT5ye4oL3Nkff/e1ZGqjvL0sLNJaye4za70Xj4AAACAXRnpJPDKgCzakgHO14HsH3r7qS4cvfQld+ThBJZcbtKtJUyUo1sqg9NbvK+hvA41TYWOVy52DcWn5yzPwfhAn+mQEcAdBXWqSSJdESZ9fPnbc9C1XO02sYpyX5+wsZnau23XhtnlnY8jRTpWgRt4X8AWaSND9mfeL8ZDTgVphRc=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCi3knQjBMOMykE1WC1CWkOsV5q0hmL0iOCSTq050rqbxjUmkXoe4BeOWZh4ID8k5GgD5emRh5PU5ME25rsP6hnelUBzDOCjdgI1rmTkUbw5oFRN+kpX2RbAM/2L5J6RrqqllFBjrHtSG6JINsPIWqLn9rlmE965Kj4AY3QplgiApeN07yDM5YPS2tpRpVHVIxZia7pdzKs+h+TXJoo/Z7SMAw8MTUNeIXd9bSzuxhulOrLNrvrcej4EVg88FYiy96oA+NBNLFb41RHNLumM3xUQvjCeyqP1hcUw9YAY+gwADhRGBKQ+JCAzzLqyM/3RAO8gXXJr1Yjr+H9xi8IwKB71QU/lw7bWc33YuNbe5mDlUHQ/a2qvo4O96wD8m4eZpu81iHiwtIU5cwKm+fk8sz9kxOR77AozaYLtjgW9FYUmxh2ChfTBV2rnFflhC3CjgRMlZv8CLtne5JcsRFSpHeCB2RXzA1JPiF89OxoUFa8NsPqdUyAMUkaR8MmW+fj+t8=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAqf0PBdBRTqmYqGYFABEC2soCa/Gsi3A2munUEZEo0enWjwRQivB5pKJDjOn6lwgGyr2ebU0/VpRzddk73uEfk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIOdij61fBkKDElS700Z560nYW2c4QCIx/VplUW7jn+UE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2797, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 735, "free": 2797}, "nocache": {"free": 3248, "used": 284}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_uuid": "ec2ca037-5a7b-1ebc-28c4-7fb32adce23a", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["6c640f10-8261-4074-b9b8-2cdc3ddcc013"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 522, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264240234496, "block_size": 4096, "block_total": 65519355, "block_available": 64511776, "block_used": 1007579, "inode_total": 131071472, "inode_available": 130998718, "inode_used": 72754, "uuid": "6c640f10-8261-4074-b9b8-2cdc3ddcc013"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22"}, "ipv6": [{"address": "fe80::9e:a1ff:fe0b:f96d", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": 
"lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": 
"off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.44.1", "interface": "eth0", "address": "10.31.44.90", "broadcast": "10.31.47.255", "netmask": "255.255.252.0", "network": "10.31.44.0", "prefix": "22", "macaddress": "02:9e:a1:0b:f9:6d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.44.90"], "ansible_all_ipv6_addresses": ["fe80::9e:a1ff:fe0b:f96d"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.44.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::9e:a1ff:fe0b:f96d"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_loadavg": {"1m": 0.58, "5m": 0.32, "15m": 0.15}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.40.7 34614 10.31.44.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.40.7 34614 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_service_mgr": "systemd", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "32", "second": "44", "epoch": "1726882364", "epoch_int": "1726882364", "date": "2024-09-20", "time": "21:32:44", "iso8601_micro": "2024-09-21T01:32:44.686613Z", "iso8601": "2024-09-21T01:32:44Z", "iso8601_basic": "20240920T213244686613", "iso8601_basic_short": "20240920T213244", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:ef4e1c39-6f50-438a-87e7-12fb70b80bde", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882364.72207: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882364.72235: _low_level_execute_command(): starting 11124 1726882364.72245: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882363.8291066-11318-226688755348887/ > /dev/null 2>&1 && sleep 0' 11124 1726882364.73650: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882364.73671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882364.73687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882364.73706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882364.73753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 <<< 11124 1726882364.73767: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882364.73783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882364.73800: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882364.73811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882364.73820: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882364.73830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882364.73843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882364.73867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882364.73881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882364.73898: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882364.73913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882364.73998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882364.74026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882364.74044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882364.74198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882364.76785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882364.76871: stderr chunk (state=3): >>><<< 11124 1726882364.76874: stdout chunk (state=3): >>><<< 11124 1726882364.77041: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882364.77044: handler run complete 11124 1726882364.77139: variable 'ansible_facts' from source: unknown 11124 1726882364.77237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882364.77628: variable 'ansible_facts' from source: unknown 11124 1726882364.77731: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882364.77871: attempt loop complete, returning result 11124 1726882364.77881: _execute() done 11124 1726882364.77887: dumping result to json 11124 1726882364.77935: done dumping result, returning 11124 1726882364.77948: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0e448fcc-3ce9-8362-0f62-000000000129] 11124 1726882364.77957: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000129 ok: [managed_node1] 11124 1726882364.78555: no more pending results, 
returning what we have 11124 1726882364.78558: results queue empty 11124 1726882364.78559: checking for any_errors_fatal 11124 1726882364.78560: done checking for any_errors_fatal 11124 1726882364.78561: checking for max_fail_percentage 11124 1726882364.78562: done checking for max_fail_percentage 11124 1726882364.78870: checking to see if all hosts have failed and the running result is not ok 11124 1726882364.78873: done checking to see if all hosts have failed 11124 1726882364.78873: getting the remaining hosts for this loop 11124 1726882364.78875: done getting the remaining hosts for this loop 11124 1726882364.78878: getting the next task for host managed_node1 11124 1726882364.78884: done getting next task for host managed_node1 11124 1726882364.78886: ^ task is: TASK: meta (flush_handlers) 11124 1726882364.78888: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882364.78891: getting variables 11124 1726882364.78893: in VariableManager get_vars() 11124 1726882364.78964: Calling all_inventory to load vars for managed_node1 11124 1726882364.78967: Calling groups_inventory to load vars for managed_node1 11124 1726882364.78970: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882364.79027: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000129 11124 1726882364.79031: WORKER PROCESS EXITING 11124 1726882364.79056: Calling all_plugins_play to load vars for managed_node1 11124 1726882364.79059: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882364.79063: Calling groups_plugins_play to load vars for managed_node1 11124 1726882364.79296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882364.79529: done with get_vars() 11124 1726882364.79539: done getting variables 11124 1726882364.79615: in VariableManager get_vars() 11124 1726882364.79628: Calling all_inventory to load vars for managed_node1 11124 1726882364.79631: Calling groups_inventory to load vars for managed_node1 11124 1726882364.79633: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882364.79636: Calling all_plugins_play to load vars for managed_node1 11124 1726882364.79639: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882364.79645: Calling groups_plugins_play to load vars for managed_node1 11124 1726882364.79815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882364.80029: done with get_vars() 11124 1726882364.80042: done queuing things up, now waiting for results queue to drain 11124 1726882364.80044: results queue empty 11124 1726882364.80044: checking for any_errors_fatal 11124 1726882364.80049: done checking for any_errors_fatal 11124 1726882364.80050: checking for max_fail_percentage 11124 
1726882364.80051: done checking for max_fail_percentage 11124 1726882364.80052: checking to see if all hosts have failed and the running result is not ok 11124 1726882364.80053: done checking to see if all hosts have failed 11124 1726882364.80053: getting the remaining hosts for this loop 11124 1726882364.80054: done getting the remaining hosts for this loop 11124 1726882364.80057: getting the next task for host managed_node1 11124 1726882364.80060: done getting next task for host managed_node1 11124 1726882364.80062: ^ task is: TASK: INIT Prepare setup 11124 1726882364.80065: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882364.80067: getting variables 11124 1726882364.80068: in VariableManager get_vars() 11124 1726882364.80080: Calling all_inventory to load vars for managed_node1 11124 1726882364.80082: Calling groups_inventory to load vars for managed_node1 11124 1726882364.80084: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882364.80091: Calling all_plugins_play to load vars for managed_node1 11124 1726882364.80093: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882364.80096: Calling groups_plugins_play to load vars for managed_node1 11124 1726882364.80237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882364.80440: done with get_vars() 11124 1726882364.80451: done getting variables 11124 1726882364.80539: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Friday 20 September 2024 21:32:44 -0400 (0:00:01.045) 0:00:05.048 ****** 11124 1726882364.80570: entering _queue_task() for managed_node1/debug 11124 1726882364.80573: Creating lock for debug 11124 1726882364.81158: worker is 1 (out of 1 available) 11124 1726882364.81172: exiting _queue_task() for managed_node1/debug 11124 1726882364.81181: done queuing things up, now waiting for results queue to drain 11124 1726882364.81183: waiting for pending results... 11124 1726882364.81465: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 11124 1726882364.81566: in run() - task 0e448fcc-3ce9-8362-0f62-00000000000b 11124 1726882364.81584: variable 'ansible_search_path' from source: unknown 11124 1726882364.81627: calling self._execute() 11124 1726882364.81714: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882364.81741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882364.81782: variable 'omit' from source: magic vars 11124 1726882364.82442: variable 'ansible_distribution_major_version' from source: facts 11124 1726882364.82467: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882364.82480: variable 'omit' from source: magic vars 11124 1726882364.82510: variable 'omit' from source: magic vars 11124 1726882364.82553: variable 'omit' from source: magic vars 11124 1726882364.82600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882364.82642: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882364.82674: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11124 1726882364.82697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882364.82715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882364.82756: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11124 1726882364.82769: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882364.82777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882364.82890: Set connection var ansible_shell_executable to /bin/sh
11124 1726882364.82904: Set connection var ansible_shell_type to sh
11124 1726882364.82916: Set connection var ansible_module_compression to ZIP_DEFLATED
11124 1726882364.82926: Set connection var ansible_timeout to 10
11124 1726882364.82937: Set connection var ansible_pipelining to False
11124 1726882364.82946: Set connection var ansible_connection to ssh
11124 1726882364.82976: variable 'ansible_shell_executable' from source: unknown
11124 1726882364.82984: variable 'ansible_connection' from source: unknown
11124 1726882364.82992: variable 'ansible_module_compression' from source: unknown
11124 1726882364.82999: variable 'ansible_shell_type' from source: unknown
11124 1726882364.83005: variable 'ansible_shell_executable' from source: unknown
11124 1726882364.83012: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882364.83019: variable 'ansible_pipelining' from source: unknown
11124 1726882364.83025: variable 'ansible_timeout' from source: unknown
11124 1726882364.83032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882364.83186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11124 1726882364.83202: variable 'omit' from source: magic vars
11124 1726882364.83211: starting attempt loop
11124 1726882364.83218: running the handler
11124 1726882364.83277: handler run complete
11124 1726882364.83305: attempt loop complete, returning result
11124 1726882364.83312: _execute() done
11124 1726882364.83319: dumping result to json
11124 1726882364.83325: done dumping result, returning
11124 1726882364.83337: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [0e448fcc-3ce9-8362-0f62-00000000000b]
11124 1726882364.83347: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000000b
11124 1726882364.83462: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000000b
11124 1726882364.83473: WORKER PROCESS EXITING
ok: [managed_node1] => {}

MSG:

##################################################
11124 1726882364.83538: no more pending results, returning what we have
11124 1726882364.83542: results queue empty
11124 1726882364.83543: checking for any_errors_fatal
11124 1726882364.83544: done checking for any_errors_fatal
11124 1726882364.83545: checking for max_fail_percentage
11124 1726882364.83547: done checking for max_fail_percentage
11124 1726882364.83550: checking to see if all hosts have failed and the running result is not ok
11124 1726882364.83551: done checking to see if all hosts have failed
11124 1726882364.83552: getting the remaining hosts for this loop
11124 1726882364.83553: done getting the remaining hosts for this loop
11124 1726882364.83557: getting the next task for host managed_node1
11124 1726882364.83566: done getting next task for host managed_node1
11124 1726882364.83570: ^ task is: TASK: Install dnsmasq
11124 1726882364.83574: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882364.83577: getting variables
11124 1726882364.83580: in VariableManager get_vars()
11124 1726882364.83623: Calling all_inventory to load vars for managed_node1
11124 1726882364.83626: Calling groups_inventory to load vars for managed_node1
11124 1726882364.83629: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882364.83641: Calling all_plugins_play to load vars for managed_node1
11124 1726882364.83644: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882364.83650: Calling groups_plugins_play to load vars for managed_node1
11124 1726882364.83831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882364.84095: done with get_vars()
11124 1726882364.84112: done getting variables
11124 1726882364.84358: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Install dnsmasq] *********************************************************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Friday 20 September 2024 21:32:44 -0400 (0:00:00.038) 0:00:05.086 ******
11124 1726882364.84395: entering _queue_task() for managed_node1/package
11124 1726882364.85045: worker is 1 (out of 1 available)
11124 1726882364.85058: exiting _queue_task() for managed_node1/package
11124 1726882364.85074: done queuing things up, now waiting for results queue to drain
11124 1726882364.85076: waiting for pending results...
11124 1726882364.85971: running TaskExecutor() for managed_node1/TASK: Install dnsmasq
11124 1726882364.86102: in run() - task 0e448fcc-3ce9-8362-0f62-00000000000f
11124 1726882364.86141: variable 'ansible_search_path' from source: unknown
11124 1726882364.86151: variable 'ansible_search_path' from source: unknown
11124 1726882364.86195: calling self._execute()
11124 1726882364.86288: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882364.86299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882364.86312: variable 'omit' from source: magic vars
11124 1726882364.86686: variable 'ansible_distribution_major_version' from source: facts
11124 1726882364.86704: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882364.86716: variable 'omit' from source: magic vars
11124 1726882364.86771: variable 'omit' from source: magic vars
11124 1726882364.86970: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11124 1726882364.89194: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11124 1726882364.89272: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11124 1726882364.89314: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11124 1726882364.89355: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11124 1726882364.89393: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11124 1726882364.89498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882364.89531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882364.89566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882364.89618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882364.89638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882364.89757: variable '__network_is_ostree' from source: set_fact
11124 1726882364.89771: variable 'omit' from source: magic vars
11124 1726882364.89805: variable 'omit' from source: magic vars
11124 1726882364.89841: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11124 1726882364.89878: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11124 1726882364.89901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11124 1726882364.89928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882364.89943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11124 1726882364.89982: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11124 1726882364.89991: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882364.89999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882364.90103: Set connection var ansible_shell_executable to /bin/sh
11124 1726882364.90118: Set connection var ansible_shell_type to sh
11124 1726882364.90130: Set connection var ansible_module_compression to ZIP_DEFLATED
11124 1726882364.90146: Set connection var ansible_timeout to 10
11124 1726882364.90160: Set connection var ansible_pipelining to False
11124 1726882364.90169: Set connection var ansible_connection to ssh
11124 1726882364.90197: variable 'ansible_shell_executable' from source: unknown
11124 1726882364.90205: variable 'ansible_connection' from source: unknown
11124 1726882364.90213: variable 'ansible_module_compression' from source: unknown
11124 1726882364.90219: variable 'ansible_shell_type' from source: unknown
11124 1726882364.90225: variable 'ansible_shell_executable' from source: unknown
11124 1726882364.90231: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882364.90238: variable 'ansible_pipelining' from source: unknown
11124 1726882364.90252: variable 'ansible_timeout' from source: unknown
11124 1726882364.90263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882364.90379: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11124 1726882364.90395: variable 'omit' from source: magic vars
11124 1726882364.90404: starting attempt loop
11124 1726882364.90411: running the handler
11124 1726882364.90422: variable 'ansible_facts' from source: unknown
11124 1726882364.90430: variable 'ansible_facts' from source: unknown
11124 1726882364.90475: _low_level_execute_command(): starting
11124 1726882364.90487: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11124 1726882364.91233: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882364.91246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882364.91262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882364.91280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882364.91321: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882364.91333: stderr chunk (state=3): >>>debug2: match not found <<<
11124 1726882364.91353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882364.91371: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
11124 1726882364.91381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
11124 1726882364.91390: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
11124 1726882364.91401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882364.91413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882364.91426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882364.91436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882364.91447: stderr chunk (state=3): >>>debug2: match found <<<
11124 1726882364.91465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882364.91539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11124 1726882364.91578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11124 1726882364.91593: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11124 1726882364.91728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11124 1726882364.93983: stdout chunk (state=3): >>>/root <<<
11124 1726882364.94133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11124 1726882364.94244: stderr chunk (state=3): >>><<<
11124 1726882364.94262: stdout chunk (state=3): >>><<<
11124 1726882364.94405: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11124 1726882364.94409: _low_level_execute_command(): starting
11124 1726882364.94412: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772 `" && echo ansible-tmp-1726882364.9430037-11380-95782258895772="` echo /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772 `" ) && sleep 0'
11124 1726882364.95078: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882364.95094: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882364.95109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882364.95129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882364.95186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882364.95199: stderr chunk (state=3): >>>debug2: match not found <<<
11124 1726882364.95211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882364.95225: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
11124 1726882364.95235: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
11124 1726882364.95243: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
11124 1726882364.95255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882364.95270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882364.95292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882364.95301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882364.95309: stderr chunk (state=3): >>>debug2: match found <<<
11124 1726882364.95320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882364.95407: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11124 1726882364.95427: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11124 1726882364.95439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11124 1726882364.95584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11124 1726882364.98320: stdout chunk (state=3): >>>ansible-tmp-1726882364.9430037-11380-95782258895772=/root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772 <<<
11124 1726882364.98487: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11124 1726882364.98566: stderr chunk (state=3): >>><<<
11124 1726882364.98570: stdout chunk (state=3): >>><<<
11124 1726882364.98670: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882364.9430037-11380-95782258895772=/root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11124 1726882364.98673: variable 'ansible_module_compression' from source: unknown
11124 1726882364.98871: ANSIBALLZ: Using generic lock for ansible.legacy.dnf
11124 1726882364.98875: ANSIBALLZ: Acquiring lock
11124 1726882364.98877: ANSIBALLZ: Lock acquired: 139628947188928
11124 1726882364.98880: ANSIBALLZ: Creating module
11124 1726882365.19656: ANSIBALLZ: Writing module into payload
11124 1726882365.19929: ANSIBALLZ: Writing module
11124 1726882365.20601: ANSIBALLZ: Renaming module
11124 1726882365.20615: ANSIBALLZ: Done creating module
11124 1726882365.20647: variable 'ansible_facts' from source: unknown
11124 1726882365.20754: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/AnsiballZ_dnf.py
11124 1726882365.21530: Sending initial data
11124 1726882365.21533: Sent initial data (151 bytes)
11124 1726882365.23136: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882365.23157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882365.23182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882365.23185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882365.23218: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882365.23234: stderr chunk (state=3): >>>debug2: match not found <<<
11124 1726882365.23249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882365.23271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
11124 1726882365.23284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
11124 1726882365.23294: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
11124 1726882365.23311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882365.23327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882365.23348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882365.23362: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882365.23375: stderr chunk (state=3): >>>debug2: match found <<<
11124 1726882365.23391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882365.23472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11124 1726882365.23501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11124 1726882365.23517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11124 1726882365.23653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11124 1726882365.26278: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<<
11124 1726882365.26381: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<<
11124 1726882365.26485: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp6kx83gc0 /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/AnsiballZ_dnf.py <<<
11124 1726882365.26575: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<<
11124 1726882365.28383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11124 1726882365.28620: stderr chunk (state=3): >>><<<
11124 1726882365.28625: stdout chunk (state=3): >>><<<
11124 1726882365.28628: done transferring module to remote
11124 1726882365.28631: _low_level_execute_command(): starting
11124 1726882365.28634: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/ /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/AnsiballZ_dnf.py && sleep 0'
11124 1726882365.29200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882365.29214: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882365.29226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882365.29243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882365.29288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882365.29300: stderr chunk (state=3): >>>debug2: match not found <<<
11124 1726882365.29312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882365.29328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
11124 1726882365.29339: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
11124 1726882365.29349: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
11124 1726882365.29362: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882365.29379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882365.29396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882365.29408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882365.29419: stderr chunk (state=3): >>>debug2: match found <<<
11124 1726882365.29433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882365.29569: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11124 1726882365.29589: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11124 1726882365.29606: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11124 1726882365.29738: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11124 1726882365.32552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11124 1726882365.32557: stdout chunk (state=3): >>><<<
11124 1726882365.32559: stderr chunk (state=3): >>><<<
11124 1726882365.32569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11124 1726882365.32572: _low_level_execute_command(): starting
11124 1726882365.32580: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/AnsiballZ_dnf.py && sleep 0'
11124 1726882365.33439: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882365.33455: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882365.33473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882365.33498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882365.33541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882365.33553: stderr chunk (state=3): >>>debug2: match not found <<<
11124 1726882365.33569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882365.33591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
11124 1726882365.33609: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
11124 1726882365.33620: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
11124 1726882365.33631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882365.33644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882365.33659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882365.33680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882365.33693: stderr chunk (state=3): >>>debug2: match found <<<
11124 1726882365.33716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882365.34139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11124 1726882365.34159: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
11124 1726882365.34283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11124 1726882368.37622: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.85-16.el9.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<<
11124 1726882368.44675: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<<
11124 1726882368.44775: stderr chunk (state=3): >>><<<
11124 1726882368.44786: stdout chunk (state=3): >>><<<
11124 1726882368.44938: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.85-16.el9.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed.
11124 1726882368.44941: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
11124 1726882368.44944: _low_level_execute_command(): starting
11124 1726882368.44946: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882364.9430037-11380-95782258895772/ > /dev/null 2>&1 && sleep 0'
11124 1726882368.45834: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11124 1726882368.45846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882368.45860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882368.45880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882368.45952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882368.45964: stderr chunk (state=3): >>>debug2: match not found <<<
11124 1726882368.45977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882368.45992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
11124 1726882368.46001: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<<
11124 1726882368.46010: stderr chunk (state=3): >>>debug1: re-parsing configuration <<<
11124 1726882368.46023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11124 1726882368.46039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11124 1726882368.46053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11124 1726882368.46066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<<
11124 1726882368.46077: stderr chunk (state=3): >>>debug2: match found <<<
11124 1726882368.46091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11124 1726882368.46176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
11124 1726882368.46194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11124 1726882368.46209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11124 1726882368.46334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11124 1726882368.48315: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11124 1726882368.48319: stdout chunk (state=3): >>><<<
11124 1726882368.48326: stderr chunk (state=3): >>><<<
11124 1726882368.48342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882368.48348: handler run complete 11124 1726882368.48537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882368.48759: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882368.48784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882368.48811: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882368.48850: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882368.48927: variable '__install_status' from source: unknown 11124 1726882368.48941: Evaluated conditional (__install_status is success): True 11124 1726882368.48960: attempt loop complete, returning result 11124 1726882368.48963: _execute() done 11124 1726882368.48967: dumping result to 
json 11124 1726882368.48972: done dumping result, returning 11124 1726882368.48980: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq [0e448fcc-3ce9-8362-0f62-00000000000f] 11124 1726882368.48984: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000000f 11124 1726882368.49132: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000000f 11124 1726882368.49134: WORKER PROCESS EXITING changed: [managed_node1] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.85-16.el9.x86_64" ] } 11124 1726882368.49229: no more pending results, returning what we have 11124 1726882368.49256: results queue empty 11124 1726882368.49257: checking for any_errors_fatal 11124 1726882368.49301: done checking for any_errors_fatal 11124 1726882368.49303: checking for max_fail_percentage 11124 1726882368.49304: done checking for max_fail_percentage 11124 1726882368.49305: checking to see if all hosts have failed and the running result is not ok 11124 1726882368.49306: done checking to see if all hosts have failed 11124 1726882368.49307: getting the remaining hosts for this loop 11124 1726882368.49308: done getting the remaining hosts for this loop 11124 1726882368.49311: getting the next task for host managed_node1 11124 1726882368.49315: done getting next task for host managed_node1 11124 1726882368.49318: ^ task is: TASK: Install pgrep, sysctl 11124 1726882368.49320: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11124 1726882368.49346: getting variables 11124 1726882368.49347: in VariableManager get_vars() 11124 1726882368.49431: Calling all_inventory to load vars for managed_node1 11124 1726882368.49434: Calling groups_inventory to load vars for managed_node1 11124 1726882368.49436: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882368.49574: Calling all_plugins_play to load vars for managed_node1 11124 1726882368.49578: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882368.49582: Calling groups_plugins_play to load vars for managed_node1 11124 1726882368.49993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882368.50453: done with get_vars() 11124 1726882368.50467: done getting variables 11124 1726882368.50553: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:32:48 -0400 (0:00:03.661) 0:00:08.748 ****** 11124 1726882368.50586: entering _queue_task() for managed_node1/package 11124 1726882368.51850: worker is 1 (out of 1 available) 11124 1726882368.51895: exiting _queue_task() for managed_node1/package 11124 1726882368.51907: done queuing things up, now waiting for results queue to drain 11124 1726882368.51909: waiting for pending results... 
11124 1726882368.52302: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11124 1726882368.52435: in run() - task 0e448fcc-3ce9-8362-0f62-000000000010 11124 1726882368.52455: variable 'ansible_search_path' from source: unknown 11124 1726882368.52467: variable 'ansible_search_path' from source: unknown 11124 1726882368.52535: calling self._execute() 11124 1726882368.52627: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882368.52644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882368.52661: variable 'omit' from source: magic vars 11124 1726882368.53572: variable 'ansible_distribution_major_version' from source: facts 11124 1726882368.53591: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882368.53814: variable 'ansible_os_family' from source: facts 11124 1726882368.53850: Evaluated conditional (ansible_os_family == 'RedHat'): True 11124 1726882368.54216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882368.54740: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882368.54820: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882368.54883: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882368.54927: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882368.55182: variable 'ansible_distribution_major_version' from source: facts 11124 1726882368.55307: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11124 1726882368.55360: when evaluation is False, skipping this task 11124 1726882368.55372: _execute() done 11124 1726882368.55378: dumping result to json 11124 1726882368.55386: done dumping result, 
returning 11124 1726882368.55395: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0e448fcc-3ce9-8362-0f62-000000000010] 11124 1726882368.55406: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000010 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11124 1726882368.55568: no more pending results, returning what we have 11124 1726882368.55586: results queue empty 11124 1726882368.55587: checking for any_errors_fatal 11124 1726882368.55596: done checking for any_errors_fatal 11124 1726882368.55597: checking for max_fail_percentage 11124 1726882368.55598: done checking for max_fail_percentage 11124 1726882368.55599: checking to see if all hosts have failed and the running result is not ok 11124 1726882368.55600: done checking to see if all hosts have failed 11124 1726882368.55601: getting the remaining hosts for this loop 11124 1726882368.55602: done getting the remaining hosts for this loop 11124 1726882368.55606: getting the next task for host managed_node1 11124 1726882368.55613: done getting next task for host managed_node1 11124 1726882368.55615: ^ task is: TASK: Install pgrep, sysctl 11124 1726882368.55618: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882368.55621: getting variables 11124 1726882368.55623: in VariableManager get_vars() 11124 1726882368.55685: Calling all_inventory to load vars for managed_node1 11124 1726882368.55689: Calling groups_inventory to load vars for managed_node1 11124 1726882368.55693: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882368.55705: Calling all_plugins_play to load vars for managed_node1 11124 1726882368.55708: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882368.55711: Calling groups_plugins_play to load vars for managed_node1 11124 1726882368.55998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882368.56249: done with get_vars() 11124 1726882368.56377: done getting variables 11124 1726882368.56514: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000010 11124 1726882368.56517: WORKER PROCESS EXITING 11124 1726882368.56554: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 21:32:48 -0400 (0:00:00.060) 0:00:08.808 ****** 11124 1726882368.56643: entering _queue_task() for managed_node1/package 11124 1726882368.56982: worker is 1 (out of 1 available) 11124 1726882368.56993: exiting _queue_task() for managed_node1/package 11124 1726882368.57005: done queuing things up, now waiting for results queue to drain 11124 1726882368.57007: waiting for pending results... 
11124 1726882368.57244: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11124 1726882368.57409: in run() - task 0e448fcc-3ce9-8362-0f62-000000000011 11124 1726882368.57425: variable 'ansible_search_path' from source: unknown 11124 1726882368.57433: variable 'ansible_search_path' from source: unknown 11124 1726882368.57506: calling self._execute() 11124 1726882368.57607: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882368.57616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882368.57656: variable 'omit' from source: magic vars 11124 1726882368.58151: variable 'ansible_distribution_major_version' from source: facts 11124 1726882368.58178: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882368.58310: variable 'ansible_os_family' from source: facts 11124 1726882368.58321: Evaluated conditional (ansible_os_family == 'RedHat'): True 11124 1726882368.58559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882368.59056: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882368.59110: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882368.59147: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882368.59188: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882368.59294: variable 'ansible_distribution_major_version' from source: facts 11124 1726882368.59312: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11124 1726882368.59326: variable 'omit' from source: magic vars 11124 1726882368.59377: variable 'omit' from source: magic vars 11124 1726882368.59640: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882368.62489: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882368.62567: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882368.62610: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882368.62649: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882368.62687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882368.62780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882368.62817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882368.62853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882368.62904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882368.62924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882368.63031: variable '__network_is_ostree' from source: set_fact 11124 1726882368.63042: 
variable 'omit' from source: magic vars 11124 1726882368.63079: variable 'omit' from source: magic vars 11124 1726882368.63106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882368.63140: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882368.63168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882368.63190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882368.63204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882368.63239: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882368.63247: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882368.63255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882368.63360: Set connection var ansible_shell_executable to /bin/sh 11124 1726882368.63378: Set connection var ansible_shell_type to sh 11124 1726882368.63390: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882368.63399: Set connection var ansible_timeout to 10 11124 1726882368.63407: Set connection var ansible_pipelining to False 11124 1726882368.63413: Set connection var ansible_connection to ssh 11124 1726882368.63443: variable 'ansible_shell_executable' from source: unknown 11124 1726882368.63451: variable 'ansible_connection' from source: unknown 11124 1726882368.63458: variable 'ansible_module_compression' from source: unknown 11124 1726882368.63467: variable 'ansible_shell_type' from source: unknown 11124 1726882368.63475: variable 'ansible_shell_executable' from source: unknown 11124 1726882368.63484: variable 'ansible_host' from source: host vars for 'managed_node1' 
11124 1726882368.63491: variable 'ansible_pipelining' from source: unknown 11124 1726882368.63498: variable 'ansible_timeout' from source: unknown 11124 1726882368.63505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882368.63616: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882368.63631: variable 'omit' from source: magic vars 11124 1726882368.63640: starting attempt loop 11124 1726882368.63647: running the handler 11124 1726882368.63664: variable 'ansible_facts' from source: unknown 11124 1726882368.63672: variable 'ansible_facts' from source: unknown 11124 1726882368.63712: _low_level_execute_command(): starting 11124 1726882368.63724: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882368.64491: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882368.64505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.64520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.64545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.64591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.64604: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882368.64618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.64637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882368.64666: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882368.64682: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882368.64695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.64710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.64726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.64739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.64751: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882368.64772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.64850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882368.64880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882368.64902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882368.65033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882368.66687: stdout chunk (state=3): >>>/root <<< 11124 1726882368.66794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882368.66880: stderr chunk (state=3): >>><<< 11124 1726882368.66899: stdout chunk (state=3): >>><<< 11124 1726882368.66971: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882368.66975: _low_level_execute_command(): starting 11124 1726882368.66978: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132 `" && echo ansible-tmp-1726882368.669274-11578-200645838748132="` echo /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132 `" ) && sleep 0' 11124 1726882368.67639: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882368.67655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.67672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.67696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.67737: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.67754: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882368.67770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.67791: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882368.67805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882368.67815: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882368.67826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.67839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.67853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.67877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.67890: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882368.67906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.67985: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882368.68011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882368.68032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882368.68156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882368.70029: stdout chunk (state=3): >>>ansible-tmp-1726882368.669274-11578-200645838748132=/root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132 <<< 11124 1726882368.70134: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882368.70230: stderr chunk (state=3): >>><<< 11124 1726882368.70241: stdout chunk (state=3): >>><<< 11124 1726882368.70575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882368.669274-11578-200645838748132=/root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882368.70579: variable 'ansible_module_compression' from source: unknown 11124 1726882368.70581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11124 1726882368.70583: variable 'ansible_facts' from source: unknown 11124 1726882368.70585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/AnsiballZ_dnf.py 11124 1726882368.70707: Sending initial data 11124 1726882368.70711: Sent initial data (151 bytes) 11124 1726882368.71751: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882368.71775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.71791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.71810: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.71856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.71871: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882368.71896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.71916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882368.71929: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882368.71941: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882368.71953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.71968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.71985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.72004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.72015: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882368.72028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.72112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882368.72137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882368.72152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882368.72280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882368.74084: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882368.74176: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882368.74274: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpo642vwfg /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/AnsiballZ_dnf.py <<< 11124 1726882368.74364: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882368.75934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882368.76227: stderr chunk (state=3): >>><<< 11124 1726882368.76231: stdout chunk (state=3): >>><<< 11124 1726882368.76233: done transferring module to remote 11124 1726882368.76235: _low_level_execute_command(): starting 11124 1726882368.76238: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/ /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/AnsiballZ_dnf.py && sleep 0' 11124 1726882368.76857: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882368.76873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.76887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.76912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
11124 1726882368.76954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.76967: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882368.76980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.76997: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882368.77016: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882368.77027: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882368.77039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.77051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.77073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.77086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.77096: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882368.77110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.77191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882368.77211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882368.77233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882368.77361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882368.79117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882368.79192: stderr chunk (state=3): >>><<< 11124 1726882368.79203: stdout chunk (state=3): >>><<< 11124 1726882368.79309: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882368.79313: _low_level_execute_command(): starting 11124 1726882368.79318: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/AnsiballZ_dnf.py && sleep 0' 11124 1726882368.79914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882368.79928: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.79943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.79961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.80015: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 11124 1726882368.80027: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882368.80042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.80067: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882368.80080: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882368.80095: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882368.80113: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882368.80127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882368.80143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882368.80156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882368.80171: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882368.80185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882368.80271: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882368.80293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882368.80314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882368.80458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882369.81811: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], 
"disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11124 1726882369.87479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882369.87482: stdout chunk (state=3): >>><<< 11124 1726882369.87485: stderr chunk (state=3): >>><<< 11124 1726882369.87632: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882369.87637: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882369.87639: _low_level_execute_command(): starting 11124 1726882369.87641: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882368.669274-11578-200645838748132/ > /dev/null 2>&1 && sleep 0' 11124 1726882369.88391: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882369.88405: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 11124 1726882369.88429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882369.88448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882369.88506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882369.88519: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882369.88533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882369.88550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882369.88566: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882369.88599: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882369.88612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882369.88626: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882369.88642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882369.88654: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882369.88687: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882369.88710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882369.88786: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882369.88818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882369.88835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882369.88959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 11124 1726882369.90804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882369.91601: stderr chunk (state=3): >>><<< 11124 1726882369.91679: stdout chunk (state=3): >>><<< 11124 1726882369.91698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882369.91705: handler run complete 11124 1726882369.91748: attempt loop complete, returning result 11124 1726882369.91751: _execute() done 11124 1726882369.91756: dumping result to json 11124 1726882369.91762: done dumping result, returning 11124 1726882369.91773: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0e448fcc-3ce9-8362-0f62-000000000011] 11124 1726882369.91777: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000011 11124 1726882369.91885: done sending task result for task 
0e448fcc-3ce9-8362-0f62-000000000011 11124 1726882369.91887: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11124 1726882369.92058: no more pending results, returning what we have 11124 1726882369.92062: results queue empty 11124 1726882369.92064: checking for any_errors_fatal 11124 1726882369.92074: done checking for any_errors_fatal 11124 1726882369.92075: checking for max_fail_percentage 11124 1726882369.92077: done checking for max_fail_percentage 11124 1726882369.92078: checking to see if all hosts have failed and the running result is not ok 11124 1726882369.92079: done checking to see if all hosts have failed 11124 1726882369.92080: getting the remaining hosts for this loop 11124 1726882369.92081: done getting the remaining hosts for this loop 11124 1726882369.92086: getting the next task for host managed_node1 11124 1726882369.92093: done getting next task for host managed_node1 11124 1726882369.92096: ^ task is: TASK: Create test interfaces 11124 1726882369.92100: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882369.92103: getting variables 11124 1726882369.92105: in VariableManager get_vars() 11124 1726882369.92157: Calling all_inventory to load vars for managed_node1 11124 1726882369.92160: Calling groups_inventory to load vars for managed_node1 11124 1726882369.92163: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882369.92177: Calling all_plugins_play to load vars for managed_node1 11124 1726882369.92180: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882369.92183: Calling groups_plugins_play to load vars for managed_node1 11124 1726882369.92577: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882369.92779: done with get_vars() 11124 1726882369.92789: done getting variables 11124 1726882369.92992: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:32:49 -0400 (0:00:01.363) 0:00:10.172 ****** 11124 1726882369.93023: entering _queue_task() for managed_node1/shell 11124 1726882369.93066: Creating lock for shell 11124 1726882369.93699: worker is 1 (out of 1 available) 11124 1726882369.93709: exiting _queue_task() for managed_node1/shell 11124 1726882369.93720: done queuing things up, now waiting for results queue to drain 11124 1726882369.93721: waiting for pending results... 
11124 1726882369.94600: running TaskExecutor() for managed_node1/TASK: Create test interfaces 11124 1726882369.94831: in run() - task 0e448fcc-3ce9-8362-0f62-000000000012 11124 1726882369.94854: variable 'ansible_search_path' from source: unknown 11124 1726882369.94865: variable 'ansible_search_path' from source: unknown 11124 1726882369.94924: calling self._execute() 11124 1726882369.95078: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882369.95221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882369.95236: variable 'omit' from source: magic vars 11124 1726882369.95963: variable 'ansible_distribution_major_version' from source: facts 11124 1726882369.96101: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882369.96113: variable 'omit' from source: magic vars 11124 1726882369.96166: variable 'omit' from source: magic vars 11124 1726882369.97169: variable 'dhcp_interface1' from source: play vars 11124 1726882369.97187: variable 'dhcp_interface2' from source: play vars 11124 1726882369.97224: variable 'omit' from source: magic vars 11124 1726882369.97273: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882369.97512: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882369.97632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882369.97656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882369.97676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882369.97708: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882369.97724: variable 'ansible_host' from source: host 
vars for 'managed_node1' 11124 1726882369.97833: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882369.97968: Set connection var ansible_shell_executable to /bin/sh 11124 1726882369.98061: Set connection var ansible_shell_type to sh 11124 1726882369.98076: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882369.98084: Set connection var ansible_timeout to 10 11124 1726882369.98091: Set connection var ansible_pipelining to False 11124 1726882369.98095: Set connection var ansible_connection to ssh 11124 1726882369.98120: variable 'ansible_shell_executable' from source: unknown 11124 1726882369.98161: variable 'ansible_connection' from source: unknown 11124 1726882369.98174: variable 'ansible_module_compression' from source: unknown 11124 1726882369.98181: variable 'ansible_shell_type' from source: unknown 11124 1726882369.98186: variable 'ansible_shell_executable' from source: unknown 11124 1726882369.98192: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882369.98202: variable 'ansible_pipelining' from source: unknown 11124 1726882369.98271: variable 'ansible_timeout' from source: unknown 11124 1726882369.98285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882369.98544: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882369.98563: variable 'omit' from source: magic vars 11124 1726882369.98606: starting attempt loop 11124 1726882369.98615: running the handler 11124 1726882369.98630: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882369.98727: _low_level_execute_command(): starting 11124 1726882369.98740: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882370.01097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.01102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.01127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.01131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.02402: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882370.02804: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882370.02900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882370.04586: stdout chunk (state=3): >>>/root <<< 11124 1726882370.04687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882370.04773: 
stderr chunk (state=3): >>><<< 11124 1726882370.04776: stdout chunk (state=3): >>><<< 11124 1726882370.04903: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882370.04906: _low_level_execute_command(): starting 11124 1726882370.04909: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802 `" && echo ansible-tmp-1726882370.047985-11635-55627478627802="` echo /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802 `" ) && sleep 0' 11124 1726882370.05918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.05922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.05962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.05966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.05969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.06029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882370.07004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882370.07102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882370.08999: stdout chunk (state=3): >>>ansible-tmp-1726882370.047985-11635-55627478627802=/root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802 <<< 11124 1726882370.09108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882370.09176: stderr chunk (state=3): >>><<< 11124 1726882370.09192: stdout chunk (state=3): >>><<< 11124 1726882370.09474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882370.047985-11635-55627478627802=/root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882370.09479: variable 'ansible_module_compression' from source: unknown 11124 1726882370.09482: ANSIBALLZ: Using generic lock for ansible.legacy.command 11124 1726882370.09484: ANSIBALLZ: Acquiring lock 11124 1726882370.09489: ANSIBALLZ: Lock acquired: 139628947188928 11124 1726882370.09492: ANSIBALLZ: Creating module 11124 1726882370.27156: ANSIBALLZ: Writing module into payload 11124 1726882370.27276: ANSIBALLZ: Writing module 11124 1726882370.27302: ANSIBALLZ: Renaming module 11124 1726882370.27319: ANSIBALLZ: Done creating module 11124 1726882370.27340: variable 'ansible_facts' from source: unknown 11124 1726882370.27418: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/AnsiballZ_command.py 11124 1726882370.27577: Sending initial data 11124 1726882370.27581: Sent initial data (154 bytes) 11124 1726882370.28522: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882370.28537: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.28556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.28578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.28619: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882370.28632: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882370.28647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.28713: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882370.28728: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882370.28741: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882370.28761: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.28782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.28799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.28812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882370.28824: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882370.28838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.28917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882370.28941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882370.28961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882370.29093: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882370.30958: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11124 1726882370.30963: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882370.31050: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882370.31156: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpkh4l8c5a /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/AnsiballZ_command.py <<< 11124 1726882370.31270: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882370.32829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882370.33020: stderr chunk (state=3): >>><<< 11124 1726882370.33032: stdout chunk (state=3): >>><<< 11124 1726882370.33070: done transferring module to remote 11124 1726882370.33093: _low_level_execute_command(): starting 11124 1726882370.33097: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/ /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/AnsiballZ_command.py && sleep 0' 11124 1726882370.34106: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882370.34114: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.34124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.34137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.34174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882370.34181: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882370.34191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.34204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882370.34210: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882370.34218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882370.34225: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.34235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.34246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.34253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882370.34260: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882370.34272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.34451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882370.34480: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882370.34491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882370.34607: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11124 1726882370.36434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882370.36438: stdout chunk (state=3): >>><<< 11124 1726882370.36445: stderr chunk (state=3): >>><<< 11124 1726882370.36467: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882370.36470: _low_level_execute_command(): starting 11124 1726882370.36473: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/AnsiballZ_command.py && sleep 0' 11124 1726882370.37512: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882370.37526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.37536: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.37552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.37595: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882370.37602: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882370.37613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.37628: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882370.37638: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882370.37645: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882370.37653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882370.37665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882370.37676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882370.37691: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882370.37698: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882370.37707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882370.37782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882370.37805: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882370.37817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882370.37941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882371.74974: stdout chunk (state=3): >>> {"changed": true, 
"stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 618 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 618 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in 
https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:32:50.507692", "end": "2024-09-20 21:32:51.747099", "delta": "0:00:01.239407", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882371.76259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882371.76265: stdout chunk (state=3): >>><<< 11124 1726882371.76272: stderr chunk (state=3): >>><<< 11124 1726882371.76304: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 618 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 618 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ grep -q 'inet [1-9]'\n+ ip addr show testbr\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:32:50.507692", "end": "2024-09-20 21:32:51.747099", "delta": "0:00:01.239407", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
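The task script echoed in the module result above polls `ip addr show testbr` in a bounded loop as a workaround for the NetworkManager bug it links (bugzilla 2079642). A minimal, self-contained sketch of that retry pattern, with a hypothetical `check_ready` standing in for the real `ip addr show testbr | grep -q 'inet [1-9]'` check (the stub and its third-poll success are assumptions for illustration, not part of the logged script):

```shell
# Bounded retry loop, mirroring the logged script's wait for testbr.
# check_ready is a stand-in (assumption) for the real interface check;
# here it succeeds on the third poll purely for illustration.
attempts=0
check_ready() { [ "$attempts" -ge 3 ]; }
while ! check_ready; do
  attempts=$((attempts + 1))
  if [ "$attempts" -eq 30 ]; then   # same 30-iteration cap as the script
    echo "ERROR - condition never became true" >&2
    exit 1
  fi
  :   # the real script sleeps 1s between polls
done
echo "ready after $attempts attempts"   # prints "ready after 3 attempts"
```

The 30-iteration cap is what makes the workaround safe: instead of hanging forever when the bridge never gets an address, the script dumps diagnostics (`ip addr`) and exits non-zero so the task fails visibly.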
11124 1726882371.76358: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882371.76399: _low_level_execute_command(): starting 11124 1726882371.76403: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882370.047985-11635-55627478627802/ > /dev/null 2>&1 && sleep 0' 11124 1726882371.77013: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882371.77023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.77033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882371.77048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.77089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.77097: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882371.77110: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.77123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882371.77131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882371.77138: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882371.77146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.77155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882371.77169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.77176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.77184: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882371.77192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.77268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882371.77285: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882371.77292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882371.77428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882371.79270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882371.79321: stderr chunk (state=3): >>><<< 11124 1726882371.79324: stdout chunk (state=3): >>><<< 11124 1726882371.79342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882371.79351: handler run complete 11124 1726882371.79374: Evaluated conditional (False): False 11124 1726882371.79385: attempt loop complete, returning result 11124 1726882371.79388: _execute() done 11124 1726882371.79394: dumping result to json 11124 1726882371.79396: done dumping result, returning 11124 1726882371.79405: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [0e448fcc-3ce9-8362-0f62-000000000012] 11124 1726882371.79409: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000012 11124 1726882371.79522: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000012 11124 1726882371.79525: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed 
false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.239407", "end": "2024-09-20 21:32:51.747099", "rc": 0, "start": "2024-09-20 21:32:50.507692" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 618 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 618 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + grep -q 'inet [1-9]' + ip addr show testbr + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11124 1726882371.79602: no more pending results, returning what we have 11124 1726882371.79605: results queue empty 11124 1726882371.79606: checking for any_errors_fatal 11124 1726882371.79612: done checking for any_errors_fatal 11124 1726882371.79613: checking for max_fail_percentage 11124 1726882371.79614: done checking for max_fail_percentage 11124 1726882371.79615: checking to see if all hosts have failed and 
the running result is not ok 11124 1726882371.79616: done checking to see if all hosts have failed 11124 1726882371.79617: getting the remaining hosts for this loop 11124 1726882371.79618: done getting the remaining hosts for this loop 11124 1726882371.79621: getting the next task for host managed_node1 11124 1726882371.79629: done getting next task for host managed_node1 11124 1726882371.79631: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11124 1726882371.79634: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882371.79639: getting variables 11124 1726882371.79640: in VariableManager get_vars() 11124 1726882371.79687: Calling all_inventory to load vars for managed_node1 11124 1726882371.79690: Calling groups_inventory to load vars for managed_node1 11124 1726882371.79692: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882371.79701: Calling all_plugins_play to load vars for managed_node1 11124 1726882371.79703: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882371.79706: Calling groups_plugins_play to load vars for managed_node1 11124 1726882371.79918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882371.80182: done with get_vars() 11124 1726882371.80192: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:32:51 -0400 (0:00:01.879) 0:00:12.051 ****** 11124 1726882371.80931: entering _queue_task() for managed_node1/include_tasks 11124 1726882371.81226: worker is 1 (out of 1 available) 11124 1726882371.81245: exiting _queue_task() for managed_node1/include_tasks 11124 1726882371.81259: done queuing things up, now waiting for results queue to drain 11124 1726882371.81261: waiting for pending results... 
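The "Create test interfaces" result above embeds a shell script whose while loop re-adds IPv4/IPv6 addresses to `testbr` until `ip addr show` reports an `inet` address, as a workaround for the NetworkManager bug tracked in rhbz#2079642. Stripped of the real `ip` commands, the retry pattern can be sketched as follows; `try_once` and `is_ready` are hypothetical stand-ins for `ip addr add ... dev testbr` and `ip addr show testbr | grep -q 'inet [1-9]'`, and the per-iteration `sleep 1` from the original is dropped:

```shell
#!/bin/sh
# Retry-until-ready pattern from the embedded test script: attempt the
# operation, re-check readiness, give up after 30 tries.
# The state file is a local stand-in for "does testbr have an address yet".
state_file="${TMPDIR:-/tmp}/testbr_ready.$$"
try_once() { touch "$state_file"; }   # stand-in for: ip addr add 192.0.2.1/24 dev testbr
is_ready() { [ -e "$state_file" ]; }  # stand-in for: ip addr show testbr | grep -q 'inet [1-9]'
timer=0
while ! is_ready; do
    timer=$((timer + 1))
    if [ "$timer" -eq 30 ]; then
        echo "ERROR - could not add testbr" >&2
        exit 1
    fi
    if ! try_once; then
        echo "NOTICE - could not add testbr - retrying" >&2
        continue
    fi
done
rm -f "$state_file"
echo "ready after $timer attempts"
```

In the STDERR trace above the loop succeeds on the first pass (`'[' 1 -eq 30 ']'` is the only counter check), so the workaround cost a single iteration on this run.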
11124 1726882371.81580: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11124 1726882371.81719: in run() - task 0e448fcc-3ce9-8362-0f62-000000000016 11124 1726882371.81738: variable 'ansible_search_path' from source: unknown 11124 1726882371.81747: variable 'ansible_search_path' from source: unknown 11124 1726882371.81803: calling self._execute() 11124 1726882371.81900: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882371.81914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882371.81932: variable 'omit' from source: magic vars 11124 1726882371.82338: variable 'ansible_distribution_major_version' from source: facts 11124 1726882371.82369: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882371.82380: _execute() done 11124 1726882371.82389: dumping result to json 11124 1726882371.82397: done dumping result, returning 11124 1726882371.82406: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-8362-0f62-000000000016] 11124 1726882371.82416: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000016 11124 1726882371.82557: no more pending results, returning what we have 11124 1726882371.82562: in VariableManager get_vars() 11124 1726882371.82613: Calling all_inventory to load vars for managed_node1 11124 1726882371.82616: Calling groups_inventory to load vars for managed_node1 11124 1726882371.82619: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882371.82633: Calling all_plugins_play to load vars for managed_node1 11124 1726882371.82636: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882371.82639: Calling groups_plugins_play to load vars for managed_node1 11124 1726882371.82860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882371.83084: done 
with get_vars() 11124 1726882371.83092: variable 'ansible_search_path' from source: unknown 11124 1726882371.83093: variable 'ansible_search_path' from source: unknown 11124 1726882371.83137: we have included files to process 11124 1726882371.83138: generating all_blocks data 11124 1726882371.83140: done generating all_blocks data 11124 1726882371.83140: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882371.83141: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882371.83143: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882371.83541: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000016 11124 1726882371.83545: WORKER PROCESS EXITING 11124 1726882371.83671: done processing included file 11124 1726882371.83674: iterating over new_blocks loaded from include file 11124 1726882371.83676: in VariableManager get_vars() 11124 1726882371.83694: done with get_vars() 11124 1726882371.83696: filtering new block on tags 11124 1726882371.83713: done filtering new block on tags 11124 1726882371.83715: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11124 1726882371.83720: extending task lists for all hosts with included blocks 11124 1726882371.83945: done extending task lists 11124 1726882371.83946: done processing included files 11124 1726882371.83947: results queue empty 11124 1726882371.83947: checking for any_errors_fatal 11124 1726882371.83957: done checking for any_errors_fatal 11124 1726882371.83958: checking for max_fail_percentage 11124 1726882371.83959: done checking for max_fail_percentage 
11124 1726882371.83960: checking to see if all hosts have failed and the running result is not ok 11124 1726882371.83960: done checking to see if all hosts have failed 11124 1726882371.83961: getting the remaining hosts for this loop 11124 1726882371.83962: done getting the remaining hosts for this loop 11124 1726882371.84080: getting the next task for host managed_node1 11124 1726882371.84088: done getting next task for host managed_node1 11124 1726882371.84091: ^ task is: TASK: Get stat for interface {{ interface }} 11124 1726882371.84094: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882371.84096: getting variables 11124 1726882371.84097: in VariableManager get_vars() 11124 1726882371.84110: Calling all_inventory to load vars for managed_node1 11124 1726882371.84112: Calling groups_inventory to load vars for managed_node1 11124 1726882371.84114: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882371.84120: Calling all_plugins_play to load vars for managed_node1 11124 1726882371.84122: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882371.84124: Calling groups_plugins_play to load vars for managed_node1 11124 1726882371.84488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882371.84880: done with get_vars() 11124 1726882371.84889: done getting variables 11124 1726882371.85443: variable 'interface' from source: task vars 11124 1726882371.85450: variable 'dhcp_interface1' from source: play vars 11124 1726882371.85602: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:32:51 -0400 (0:00:00.047) 0:00:12.099 ****** 11124 1726882371.85689: entering _queue_task() for managed_node1/stat 11124 1726882371.85988: worker is 1 (out of 1 available) 11124 1726882371.86001: exiting _queue_task() for managed_node1/stat 11124 1726882371.86012: done queuing things up, now waiting for results queue to drain 11124 1726882371.86013: waiting for pending results... 
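The `_low_level_execute_command()` calls that follow create a remote working directory with `( umask 77 && mkdir -p ... )` and echo the resolved path back as a `key="value"` line for the controller to parse. The pattern can be sketched locally; `$base` here is a hypothetical stand-in for the remote `/root/.ansible/tmp`:

```shell
#!/bin/sh
# Temp-dir creation pattern from the log: run mkdir inside a subshell with
# umask 77 so the directory is created mode 700 (owner-only), then print
# key=value so the calling side can read the resolved path back from stdout.
base="${TMPDIR:-/tmp}/ansible-sketch.$$"
( umask 77 && mkdir -p "$base" && mkdir "$base/ansible-tmp-demo" ) \
  && echo "ansible_tmp=$base/ansible-tmp-demo"
# The directory is readable only by the connecting user:
perms=$(ls -ld "$base/ansible-tmp-demo" | cut -c1-10)
echo "perms=$perms"   # drwx------
rm -rf "$base"
```

The restrictive umask matters because the `AnsiballZ_stat.py` payload transferred next (via sftp, then `chmod u+x`) lands inside this directory before being run with the remote Python.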
11124 1726882371.86310: running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 11124 1726882371.86441: in run() - task 0e448fcc-3ce9-8362-0f62-000000000153 11124 1726882371.86466: variable 'ansible_search_path' from source: unknown 11124 1726882371.86477: variable 'ansible_search_path' from source: unknown 11124 1726882371.86527: calling self._execute() 11124 1726882371.86623: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882371.86633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882371.86646: variable 'omit' from source: magic vars 11124 1726882371.87042: variable 'ansible_distribution_major_version' from source: facts 11124 1726882371.87066: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882371.87082: variable 'omit' from source: magic vars 11124 1726882371.87146: variable 'omit' from source: magic vars 11124 1726882371.87257: variable 'interface' from source: task vars 11124 1726882371.87276: variable 'dhcp_interface1' from source: play vars 11124 1726882371.87340: variable 'dhcp_interface1' from source: play vars 11124 1726882371.87373: variable 'omit' from source: magic vars 11124 1726882371.87421: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882371.87469: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882371.87516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882371.87539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882371.87561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882371.87615: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11124 1726882371.87625: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882371.87632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882371.87762: Set connection var ansible_shell_executable to /bin/sh 11124 1726882371.87781: Set connection var ansible_shell_type to sh 11124 1726882371.87808: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882371.87824: Set connection var ansible_timeout to 10 11124 1726882371.87834: Set connection var ansible_pipelining to False 11124 1726882371.87842: Set connection var ansible_connection to ssh 11124 1726882371.87875: variable 'ansible_shell_executable' from source: unknown 11124 1726882371.87883: variable 'ansible_connection' from source: unknown 11124 1726882371.87890: variable 'ansible_module_compression' from source: unknown 11124 1726882371.87898: variable 'ansible_shell_type' from source: unknown 11124 1726882371.87913: variable 'ansible_shell_executable' from source: unknown 11124 1726882371.87920: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882371.87928: variable 'ansible_pipelining' from source: unknown 11124 1726882371.87934: variable 'ansible_timeout' from source: unknown 11124 1726882371.87941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882371.88162: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882371.88181: variable 'omit' from source: magic vars 11124 1726882371.88190: starting attempt loop 11124 1726882371.88196: running the handler 11124 1726882371.88217: _low_level_execute_command(): starting 11124 1726882371.88232: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 
1726882371.89065: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882371.89083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.89100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882371.89130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.89178: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.89192: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882371.89207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.89235: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882371.89254: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882371.89269: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882371.89283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.89301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882371.89316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.89327: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.89343: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882371.89368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.89450: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882371.89488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882371.89507: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882371.89689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882371.91305: stdout chunk (state=3): >>>/root <<< 11124 1726882371.91473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882371.91504: stdout chunk (state=3): >>><<< 11124 1726882371.91507: stderr chunk (state=3): >>><<< 11124 1726882371.91641: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882371.91645: _low_level_execute_command(): starting 11124 1726882371.91651: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967 `" && echo 
ansible-tmp-1726882371.9153173-11725-137933984572967="` echo /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967 `" ) && sleep 0' 11124 1726882371.92431: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.92435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.92471: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.92475: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.92478: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.92548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882371.92557: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882371.92659: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882371.94524: stdout chunk (state=3): >>>ansible-tmp-1726882371.9153173-11725-137933984572967=/root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967 <<< 11124 1726882371.94639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882371.94707: stderr chunk 
(state=3): >>><<< 11124 1726882371.94710: stdout chunk (state=3): >>><<< 11124 1726882371.94989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882371.9153173-11725-137933984572967=/root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882371.94993: variable 'ansible_module_compression' from source: unknown 11124 1726882371.94995: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11124 1726882371.94997: variable 'ansible_facts' from source: unknown 11124 1726882371.95005: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/AnsiballZ_stat.py 11124 1726882371.95173: Sending initial data 11124 1726882371.95176: Sent initial data (153 bytes) 11124 1726882371.96204: stderr chunk 
(state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882371.96216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.96229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882371.96244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.96297: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.96307: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882371.96318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.96331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882371.96340: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882371.96351: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882371.96360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882371.96374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882371.96395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882371.96411: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882371.96423: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882371.96437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882371.96527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882371.96544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882371.96562: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11124 1726882371.96689: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882371.98994: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882371.99093: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882371.99194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp1ap91kkj /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/AnsiballZ_stat.py <<< 11124 1726882371.99289: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882372.00906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.01079: stderr chunk (state=3): >>><<< 11124 1726882372.01082: stdout chunk (state=3): >>><<< 11124 1726882372.01103: done transferring module to remote 11124 1726882372.01115: _low_level_execute_command(): starting 11124 1726882372.01118: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/ /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/AnsiballZ_stat.py && sleep 0' 11124 1726882372.01825: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 
1726882372.01834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.01845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.01860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.01902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.01909: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.01919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.01933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.01941: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.01947: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.01955: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.01967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.01981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.01988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.01996: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.02003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.02081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.02091: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.02104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.02241: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.04043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.04170: stderr chunk (state=3): >>><<< 11124 1726882372.04191: stdout chunk (state=3): >>><<< 11124 1726882372.04295: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882372.04299: _low_level_execute_command(): starting 11124 1726882372.04301: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/AnsiballZ_stat.py && sleep 0' 11124 1726882372.05109: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.05133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.05167: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.05194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.05268: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.05285: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.05304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.05327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.05340: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.05356: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.05372: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.05387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.05403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.05423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.05446: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.05467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.05580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.05609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.05634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.05792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.18927: stdout chunk 
(state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25004, "dev": 21, "nlink": 1, "atime": 1726882370.5159075, "mtime": 1726882370.5159075, "ctime": 1726882370.5159075, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11124 1726882372.20021: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882372.20026: stdout chunk (state=3): >>><<< 11124 1726882372.20028: stderr chunk (state=3): >>><<< 11124 1726882372.20070: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25004, "dev": 21, "nlink": 1, "atime": 1726882370.5159075, "mtime": 1726882370.5159075, "ctime": 1726882370.5159075, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882372.20176: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882372.20180: _low_level_execute_command(): starting 11124 1726882372.20182: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882371.9153173-11725-137933984572967/ > /dev/null 2>&1 && sleep 0' 11124 1726882372.21565: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.21579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.21593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.21634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.21690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.21701: stderr chunk 
(state=3): >>>debug2: match not found <<< 11124 1726882372.21722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.21737: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.21746: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.21765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.21805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.21828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.21871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.21906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.21951: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.21988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.22131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.22166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.22199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.22335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.24246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.24253: stdout chunk (state=3): >>><<< 11124 1726882372.24255: stderr chunk (state=3): >>><<< 11124 1726882372.24572: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882372.24576: handler run complete 11124 1726882372.24578: attempt loop complete, returning result 11124 1726882372.24580: _execute() done 11124 1726882372.24582: dumping result to json 11124 1726882372.24584: done dumping result, returning 11124 1726882372.24586: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 [0e448fcc-3ce9-8362-0f62-000000000153] 11124 1726882372.24588: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000153 11124 1726882372.24670: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000153 11124 1726882372.24673: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882370.5159075, "block_size": 4096, "blocks": 0, "ctime": 1726882370.5159075, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25004, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, 
"issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882370.5159075, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11124 1726882372.24774: no more pending results, returning what we have 11124 1726882372.24777: results queue empty 11124 1726882372.24779: checking for any_errors_fatal 11124 1726882372.24780: done checking for any_errors_fatal 11124 1726882372.24781: checking for max_fail_percentage 11124 1726882372.24783: done checking for max_fail_percentage 11124 1726882372.24783: checking to see if all hosts have failed and the running result is not ok 11124 1726882372.24785: done checking to see if all hosts have failed 11124 1726882372.24786: getting the remaining hosts for this loop 11124 1726882372.24787: done getting the remaining hosts for this loop 11124 1726882372.24791: getting the next task for host managed_node1 11124 1726882372.24797: done getting next task for host managed_node1 11124 1726882372.24800: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11124 1726882372.24803: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882372.24808: getting variables 11124 1726882372.24810: in VariableManager get_vars() 11124 1726882372.24857: Calling all_inventory to load vars for managed_node1 11124 1726882372.24866: Calling groups_inventory to load vars for managed_node1 11124 1726882372.24869: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.24882: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.24885: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.24888: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.25063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.25366: done with get_vars() 11124 1726882372.25382: done getting variables 11124 1726882372.25473: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11124 1726882372.25601: variable 'interface' from source: task vars 11124 1726882372.25605: variable 'dhcp_interface1' from source: play vars 11124 1726882372.25694: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:32:52 -0400 (0:00:00.400) 0:00:12.499 ****** 11124 1726882372.25726: entering _queue_task() for managed_node1/assert 11124 1726882372.25728: Creating lock for assert 11124 1726882372.25979: worker is 1 (out of 1 available) 11124 1726882372.25991: exiting _queue_task() for managed_node1/assert 11124 1726882372.26002: done queuing things up, now waiting for results queue to drain 11124 
1726882372.26004: waiting for pending results... 11124 1726882372.26269: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 11124 1726882372.26389: in run() - task 0e448fcc-3ce9-8362-0f62-000000000017 11124 1726882372.26408: variable 'ansible_search_path' from source: unknown 11124 1726882372.26415: variable 'ansible_search_path' from source: unknown 11124 1726882372.26518: calling self._execute() 11124 1726882372.26755: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.26770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.26791: variable 'omit' from source: magic vars 11124 1726882372.27203: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.27228: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.27238: variable 'omit' from source: magic vars 11124 1726882372.27293: variable 'omit' from source: magic vars 11124 1726882372.27404: variable 'interface' from source: task vars 11124 1726882372.27415: variable 'dhcp_interface1' from source: play vars 11124 1726882372.27497: variable 'dhcp_interface1' from source: play vars 11124 1726882372.27520: variable 'omit' from source: magic vars 11124 1726882372.27576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882372.27614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882372.27639: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882372.27676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.27695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.27727: 
variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882372.27736: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.27745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.27860: Set connection var ansible_shell_executable to /bin/sh 11124 1726882372.27882: Set connection var ansible_shell_type to sh 11124 1726882372.27897: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882372.27910: Set connection var ansible_timeout to 10 11124 1726882372.27918: Set connection var ansible_pipelining to False 11124 1726882372.27925: Set connection var ansible_connection to ssh 11124 1726882372.27951: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.27960: variable 'ansible_connection' from source: unknown 11124 1726882372.27968: variable 'ansible_module_compression' from source: unknown 11124 1726882372.27982: variable 'ansible_shell_type' from source: unknown 11124 1726882372.27990: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.27997: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.28007: variable 'ansible_pipelining' from source: unknown 11124 1726882372.28020: variable 'ansible_timeout' from source: unknown 11124 1726882372.28029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.28188: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882372.28210: variable 'omit' from source: magic vars 11124 1726882372.28219: starting attempt loop 11124 1726882372.28225: running the handler 11124 1726882372.28378: variable 'interface_stat' from source: set_fact 11124 
1726882372.28400: Evaluated conditional (interface_stat.stat.exists): True 11124 1726882372.28417: handler run complete 11124 1726882372.28437: attempt loop complete, returning result 11124 1726882372.28444: _execute() done 11124 1726882372.28453: dumping result to json 11124 1726882372.28468: done dumping result, returning 11124 1726882372.28479: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [0e448fcc-3ce9-8362-0f62-000000000017] 11124 1726882372.28489: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000017 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882372.28633: no more pending results, returning what we have 11124 1726882372.28636: results queue empty 11124 1726882372.28637: checking for any_errors_fatal 11124 1726882372.28646: done checking for any_errors_fatal 11124 1726882372.28647: checking for max_fail_percentage 11124 1726882372.28651: done checking for max_fail_percentage 11124 1726882372.28652: checking to see if all hosts have failed and the running result is not ok 11124 1726882372.28654: done checking to see if all hosts have failed 11124 1726882372.28655: getting the remaining hosts for this loop 11124 1726882372.28656: done getting the remaining hosts for this loop 11124 1726882372.28660: getting the next task for host managed_node1 11124 1726882372.28674: done getting next task for host managed_node1 11124 1726882372.28676: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11124 1726882372.28680: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882372.28683: getting variables 11124 1726882372.28685: in VariableManager get_vars() 11124 1726882372.28729: Calling all_inventory to load vars for managed_node1 11124 1726882372.28732: Calling groups_inventory to load vars for managed_node1 11124 1726882372.28735: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.28746: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.28752: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.28755: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.29235: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.29726: done with get_vars() 11124 1726882372.29736: done getting variables 11124 1726882372.29970: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000017 11124 1726882372.29973: WORKER PROCESS EXITING TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:32:52 -0400 (0:00:00.042) 0:00:12.542 ****** 11124 1726882372.30009: entering _queue_task() for managed_node1/include_tasks 11124 1726882372.30669: worker is 1 (out of 1 available) 11124 1726882372.30681: exiting _queue_task() for managed_node1/include_tasks 11124 1726882372.30694: done queuing things up, now waiting for results queue to drain 11124 1726882372.30695: waiting for pending results... 
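The two tasks traced above — the `stat` call on `/sys/class/net/test1` (which reported `islnk: true` with `lnk_target: ../../devices/virtual/net/test1`) and the follow-up assert on `interface_stat.stat.exists` — boil down to an `lstat` on the sysfs entry. A minimal local sketch of that check, using a throwaway symlink in place of the real sysfs path (the helper name `interface_stat` and the simplified result dict are illustrative, not Ansible's actual module code):

```python
import os
import stat
import tempfile

def interface_stat(path):
    # Toy stand-in for the subset of Ansible's stat module used here:
    # report existence, symlink-ness, and the raw link target.
    try:
        st = os.lstat(path)
    except FileNotFoundError:
        return {"exists": False}
    is_link = stat.S_ISLNK(st.st_mode)
    return {
        "exists": True,
        "islnk": is_link,
        "lnk_target": os.readlink(path) if is_link else None,
    }

# Demonstrate with a temporary symlink standing in for /sys/class/net/test1
with tempfile.TemporaryDirectory() as d:
    target = os.path.join(d, "devices")
    os.mkdir(target)
    link = os.path.join(d, "test1")
    os.symlink(target, link)
    result = interface_stat(link)
    # The playbook's assert is essentially: interface_stat.stat.exists
    assert result["exists"] and result["islnk"]
```

A missing interface simply yields `{"exists": False}`, which is what the assert task in `assert_device_present.yml` would catch.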
11124 1726882372.30954: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11124 1726882372.31086: in run() - task 0e448fcc-3ce9-8362-0f62-00000000001b 11124 1726882372.31106: variable 'ansible_search_path' from source: unknown 11124 1726882372.31114: variable 'ansible_search_path' from source: unknown 11124 1726882372.31168: calling self._execute() 11124 1726882372.31256: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.31271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.31284: variable 'omit' from source: magic vars 11124 1726882372.31652: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.31671: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.31689: _execute() done 11124 1726882372.31698: dumping result to json 11124 1726882372.31706: done dumping result, returning 11124 1726882372.31715: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-8362-0f62-00000000001b] 11124 1726882372.31724: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001b 11124 1726882372.31847: no more pending results, returning what we have 11124 1726882372.31855: in VariableManager get_vars() 11124 1726882372.31911: Calling all_inventory to load vars for managed_node1 11124 1726882372.31914: Calling groups_inventory to load vars for managed_node1 11124 1726882372.31916: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.31931: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.31934: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.31937: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.32133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.32391: done 
with get_vars() 11124 1726882372.32398: variable 'ansible_search_path' from source: unknown 11124 1726882372.32399: variable 'ansible_search_path' from source: unknown 11124 1726882372.32437: we have included files to process 11124 1726882372.32438: generating all_blocks data 11124 1726882372.32440: done generating all_blocks data 11124 1726882372.32444: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882372.32445: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882372.32447: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882372.32734: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001b 11124 1726882372.32737: WORKER PROCESS EXITING 11124 1726882372.32890: done processing included file 11124 1726882372.32892: iterating over new_blocks loaded from include file 11124 1726882372.32894: in VariableManager get_vars() 11124 1726882372.32912: done with get_vars() 11124 1726882372.32914: filtering new block on tags 11124 1726882372.32936: done filtering new block on tags 11124 1726882372.32938: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11124 1726882372.32943: extending task lists for all hosts with included blocks 11124 1726882372.33054: done extending task lists 11124 1726882372.33055: done processing included files 11124 1726882372.33056: results queue empty 11124 1726882372.33057: checking for any_errors_fatal 11124 1726882372.33061: done checking for any_errors_fatal 11124 1726882372.33061: checking for max_fail_percentage 11124 1726882372.33062: done checking for max_fail_percentage 
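The include sequence above (load `get_interface_stat.yml`, generate blocks, "filtering new block on tags", extend the per-host task lists) can be mimicked conceptually. This is a deliberately simplified sketch of tag filtering — the function and the sample task dicts are hypothetical, not Ansible's real API:

```python
def filter_tasks_by_tags(tasks, only_tags=None):
    # Toy version of "filtering new block on tags": keep every task when no
    # tag filter is active, otherwise keep tasks sharing at least one tag.
    if not only_tags:
        return list(tasks)
    wanted = set(only_tags)
    return [t for t in tasks if wanted & set(t.get("tags", []))]

included = [
    {"name": "Get stat for interface {{ interface }}", "tags": ["setup"]},
    {"name": "Debug helper", "tags": ["debug"]},
]
assert len(filter_tasks_by_tags(included)) == 2
assert [t["name"] for t in filter_tasks_by_tags(included, ["setup"])] == [
    "Get stat for interface {{ interface }}"
]
```

In this run no `--tags` filter was given, so both included tasks survive and are appended to `managed_node1`'s task list.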
11124 1726882372.33069: checking to see if all hosts have failed and the running result is not ok 11124 1726882372.33070: done checking to see if all hosts have failed 11124 1726882372.33071: getting the remaining hosts for this loop 11124 1726882372.33072: done getting the remaining hosts for this loop 11124 1726882372.33075: getting the next task for host managed_node1 11124 1726882372.33078: done getting next task for host managed_node1 11124 1726882372.33080: ^ task is: TASK: Get stat for interface {{ interface }} 11124 1726882372.33083: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882372.33085: getting variables 11124 1726882372.33085: in VariableManager get_vars() 11124 1726882372.33097: Calling all_inventory to load vars for managed_node1 11124 1726882372.33099: Calling groups_inventory to load vars for managed_node1 11124 1726882372.33101: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.33105: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.33107: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.33110: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.33256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.33456: done with get_vars() 11124 1726882372.33469: done getting variables 11124 1726882372.33615: variable 'interface' from source: task vars 11124 1726882372.33619: variable 'dhcp_interface2' from source: play vars 11124 1726882372.33679: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:32:52 -0400 (0:00:00.037) 0:00:12.579 ****** 11124 1726882372.33712: entering _queue_task() for managed_node1/stat 11124 1726882372.33930: worker is 1 (out of 1 available) 11124 1726882372.33942: exiting _queue_task() for managed_node1/stat 11124 1726882372.33955: done queuing things up, now waiting for results queue to drain 11124 1726882372.33957: waiting for pending results... 
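Every debug record in this log follows the same prefix shape: worker PID, then a Unix epoch timestamp with microsecond precision, then the message (e.g. `11124 1726882372.33712: entering _queue_task() ...`). That makes per-step timing easy to recover with a small parser; a sketch, assuming only the prefix format observed above:

```python
import re

# '<pid> <epoch.micros>: <message>' as seen throughout the -vvv output
LINE_RE = re.compile(r"^(?P<pid>\d+)\s+(?P<ts>\d+\.\d+):\s*(?P<msg>.*)$")

def parse_debug_line(line):
    # Split one debug record into (pid, timestamp, message); None if it
    # is not a prefixed record (e.g. wrapped SSH stderr continuation).
    m = LINE_RE.match(line)
    if not m:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

start = parse_debug_line(
    "11124 1726882372.33712: entering _queue_task() for managed_node1/stat"
)
end = parse_debug_line("11124 1726882372.34215: running TaskExecutor() for managed_node1")
pid, t0, _ = start
_, t1, _ = end
elapsed = t1 - t0  # seconds between the two events
assert pid == 11124
assert 0 < elapsed < 1
```

The `TASK [...]` banners report the same deltas directly (here `0:00:00.037` between the assert and the second stat task), so a parser like this is mainly useful for timing the fine-grained steps in between.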
11124 1726882372.34215: running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 11124 1726882372.34354: in run() - task 0e448fcc-3ce9-8362-0f62-00000000016b 11124 1726882372.34376: variable 'ansible_search_path' from source: unknown 11124 1726882372.34385: variable 'ansible_search_path' from source: unknown 11124 1726882372.34428: calling self._execute() 11124 1726882372.34514: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.34524: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.34539: variable 'omit' from source: magic vars 11124 1726882372.34890: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.34907: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.34918: variable 'omit' from source: magic vars 11124 1726882372.34985: variable 'omit' from source: magic vars 11124 1726882372.35085: variable 'interface' from source: task vars 11124 1726882372.35097: variable 'dhcp_interface2' from source: play vars 11124 1726882372.35173: variable 'dhcp_interface2' from source: play vars 11124 1726882372.35197: variable 'omit' from source: magic vars 11124 1726882372.35245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882372.35292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882372.35318: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882372.35341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.35359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.35396: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11124 1726882372.35404: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.35411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.35522: Set connection var ansible_shell_executable to /bin/sh 11124 1726882372.35537: Set connection var ansible_shell_type to sh 11124 1726882372.35554: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882372.35564: Set connection var ansible_timeout to 10 11124 1726882372.35574: Set connection var ansible_pipelining to False 11124 1726882372.35581: Set connection var ansible_connection to ssh 11124 1726882372.35607: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.35614: variable 'ansible_connection' from source: unknown 11124 1726882372.35620: variable 'ansible_module_compression' from source: unknown 11124 1726882372.35626: variable 'ansible_shell_type' from source: unknown 11124 1726882372.35632: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.35638: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.35647: variable 'ansible_pipelining' from source: unknown 11124 1726882372.35661: variable 'ansible_timeout' from source: unknown 11124 1726882372.35672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.35937: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882372.35954: variable 'omit' from source: magic vars 11124 1726882372.35963: starting attempt loop 11124 1726882372.35971: running the handler 11124 1726882372.35992: _low_level_execute_command(): starting 11124 1726882372.36003: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 
1726882372.36807: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.36823: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.36839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.36867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.36916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.36930: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.36944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.36969: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.36986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.37002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.37015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.37030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.37046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.37066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.37086: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.37102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.37186: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.37216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.37237: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.37371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.39035: stdout chunk (state=3): >>>/root <<< 11124 1726882372.39232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.39235: stdout chunk (state=3): >>><<< 11124 1726882372.39238: stderr chunk (state=3): >>><<< 11124 1726882372.39271: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882372.39371: _low_level_execute_command(): starting 11124 1726882372.39375: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812 `" && echo 
ansible-tmp-1726882372.3927732-11752-2071911543812="` echo /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812 `" ) && sleep 0' 11124 1726882372.40282: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.40296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.40320: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.40339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.40387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.40400: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.40420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.40437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.40451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.40461: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.40475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.40486: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.40500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.40512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.40533: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.40551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.40634: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 11124 1726882372.40669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.40687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.40816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.42681: stdout chunk (state=3): >>>ansible-tmp-1726882372.3927732-11752-2071911543812=/root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812 <<< 11124 1726882372.42791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.42884: stderr chunk (state=3): >>><<< 11124 1726882372.42897: stdout chunk (state=3): >>><<< 11124 1726882372.42969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882372.3927732-11752-2071911543812=/root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11124 1726882372.43170: variable 'ansible_module_compression' from source: unknown 11124 1726882372.43173: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11124 1726882372.43175: variable 'ansible_facts' from source: unknown 11124 1726882372.43193: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/AnsiballZ_stat.py 11124 1726882372.43357: Sending initial data 11124 1726882372.43360: Sent initial data (151 bytes) 11124 1726882372.44658: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.44676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.44692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.44712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.44767: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.44784: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.44800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.44818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.44830: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.44840: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.44856: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.44883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.44900: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.44913: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.44925: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.44938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.45028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.45053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.45075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.45217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.46977: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882372.47056: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882372.47153: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp6k29h48b /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/AnsiballZ_stat.py <<< 11124 1726882372.47237: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882372.48538: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11124 1726882372.48691: stderr chunk (state=3): >>><<< 11124 1726882372.48695: stdout chunk (state=3): >>><<< 11124 1726882372.48716: done transferring module to remote 11124 1726882372.48728: _low_level_execute_command(): starting 11124 1726882372.48734: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/ /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/AnsiballZ_stat.py && sleep 0' 11124 1726882372.49401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.49411: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.49419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.49434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.49479: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.49486: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.49497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.49510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.49519: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.49523: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.49531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.49541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.49554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.49560: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.49570: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.49582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.49653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.49673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.49685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.49801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.51825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.52131: stderr chunk (state=3): >>><<< 11124 1726882372.52134: stdout chunk (state=3): >>><<< 11124 1726882372.52154: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882372.52157: _low_level_execute_command(): starting 11124 1726882372.52160: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/AnsiballZ_stat.py && sleep 0' 11124 1726882372.53210: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882372.53218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.53229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.53244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.53369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882372.53383: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882372.53386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.53412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882372.53418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882372.53425: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882372.53434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882372.53443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882372.53454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882372.53465: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882372.53472: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882372.53481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.53548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.53570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.53582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.53722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.66902: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25590, "dev": 21, "nlink": 1, "atime": 1726882370.5236185, "mtime": 1726882370.5236185, "ctime": 1726882370.5236185, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11124 1726882372.67974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882372.67994: stdout chunk (state=3): >>><<< 11124 1726882372.67997: stderr chunk (state=3): >>><<< 11124 1726882372.68161: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 25590, "dev": 21, "nlink": 1, "atime": 1726882370.5236185, "mtime": 1726882370.5236185, "ctime": 1726882370.5236185, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882372.68173: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882372.68177: _low_level_execute_command(): starting 11124 1726882372.68179: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882372.3927732-11752-2071911543812/ > /dev/null 2>&1 && sleep 0' 11124 1726882372.69368: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882372.69713: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882372.69731: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882372.69745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882372.69872: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882372.71733: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882372.71737: stdout chunk (state=3): >>><<< 11124 1726882372.71745: stderr chunk (state=3): >>><<< 11124 1726882372.71762: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882372.71770: handler run complete 11124 1726882372.71823: attempt loop complete, returning result 11124 1726882372.71826: _execute() done 11124 1726882372.71828: dumping result to json 11124 1726882372.71834: done dumping result, returning 11124 1726882372.71843: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 [0e448fcc-3ce9-8362-0f62-00000000016b] 11124 1726882372.71851: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000016b 11124 1726882372.71962: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000016b 11124 1726882372.71965: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882370.5236185, "block_size": 4096, "blocks": 0, "ctime": 1726882370.5236185, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 25590, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882370.5236185, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11124 1726882372.72076: no more pending results, returning what we have 11124 1726882372.72079: results queue empty 11124 1726882372.72080: checking for any_errors_fatal 11124 1726882372.72081: done checking for any_errors_fatal 11124 1726882372.72082: checking for max_fail_percentage 11124 1726882372.72083: done checking for max_fail_percentage 11124 1726882372.72084: checking to see if all hosts have 
failed and the running result is not ok 11124 1726882372.72085: done checking to see if all hosts have failed 11124 1726882372.72086: getting the remaining hosts for this loop 11124 1726882372.72087: done getting the remaining hosts for this loop 11124 1726882372.72090: getting the next task for host managed_node1 11124 1726882372.72097: done getting next task for host managed_node1 11124 1726882372.72099: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11124 1726882372.72102: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882372.72106: getting variables 11124 1726882372.72107: in VariableManager get_vars() 11124 1726882372.72141: Calling all_inventory to load vars for managed_node1 11124 1726882372.72144: Calling groups_inventory to load vars for managed_node1 11124 1726882372.72146: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.72159: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.72162: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.72172: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.72401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.72618: done with get_vars() 11124 1726882372.72627: done getting variables 11124 1726882372.72847: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882372.73010: variable 'interface' from source: task vars 11124 1726882372.73013: variable 'dhcp_interface2' from source: play vars 11124 1726882372.73192: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:32:52 -0400 (0:00:00.395) 0:00:12.974 ****** 11124 1726882372.73222: entering _queue_task() for managed_node1/assert 11124 1726882372.73939: worker is 1 (out of 1 available) 11124 1726882372.73954: exiting _queue_task() for managed_node1/assert 11124 1726882372.73968: done queuing things up, now waiting for results queue to drain 11124 1726882372.73970: waiting for pending results... 
11124 1726882372.74947: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 11124 1726882372.75190: in run() - task 0e448fcc-3ce9-8362-0f62-00000000001c 11124 1726882372.75210: variable 'ansible_search_path' from source: unknown 11124 1726882372.75220: variable 'ansible_search_path' from source: unknown 11124 1726882372.75274: calling self._execute() 11124 1726882372.75422: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.75565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.75582: variable 'omit' from source: magic vars 11124 1726882372.76459: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.76482: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.76508: variable 'omit' from source: magic vars 11124 1726882372.76598: variable 'omit' from source: magic vars 11124 1726882372.77719: variable 'interface' from source: task vars 11124 1726882372.77792: variable 'dhcp_interface2' from source: play vars 11124 1726882372.77874: variable 'dhcp_interface2' from source: play vars 11124 1726882372.77897: variable 'omit' from source: magic vars 11124 1726882372.77951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882372.77998: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882372.78023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882372.78047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.78079: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.78113: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 11124 1726882372.78121: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.78127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.78262: Set connection var ansible_shell_executable to /bin/sh 11124 1726882372.78289: Set connection var ansible_shell_type to sh 11124 1726882372.78302: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882372.78311: Set connection var ansible_timeout to 10 11124 1726882372.78320: Set connection var ansible_pipelining to False 11124 1726882372.78326: Set connection var ansible_connection to ssh 11124 1726882372.78358: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.78367: variable 'ansible_connection' from source: unknown 11124 1726882372.78376: variable 'ansible_module_compression' from source: unknown 11124 1726882372.78387: variable 'ansible_shell_type' from source: unknown 11124 1726882372.78399: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.78406: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.78413: variable 'ansible_pipelining' from source: unknown 11124 1726882372.78420: variable 'ansible_timeout' from source: unknown 11124 1726882372.78427: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.78641: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882372.78665: variable 'omit' from source: magic vars 11124 1726882372.78679: starting attempt loop 11124 1726882372.78686: running the handler 11124 1726882372.78862: variable 'interface_stat' from source: set_fact 11124 1726882372.78892: Evaluated conditional 
(interface_stat.stat.exists): True 11124 1726882372.78904: handler run complete 11124 1726882372.78921: attempt loop complete, returning result 11124 1726882372.78933: _execute() done 11124 1726882372.78944: dumping result to json 11124 1726882372.78953: done dumping result, returning 11124 1726882372.78965: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [0e448fcc-3ce9-8362-0f62-00000000001c] 11124 1726882372.78973: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001c ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882372.79130: no more pending results, returning what we have 11124 1726882372.79137: results queue empty 11124 1726882372.79139: checking for any_errors_fatal 11124 1726882372.79147: done checking for any_errors_fatal 11124 1726882372.79147: checking for max_fail_percentage 11124 1726882372.79150: done checking for max_fail_percentage 11124 1726882372.79151: checking to see if all hosts have failed and the running result is not ok 11124 1726882372.79153: done checking to see if all hosts have failed 11124 1726882372.79154: getting the remaining hosts for this loop 11124 1726882372.79155: done getting the remaining hosts for this loop 11124 1726882372.79159: getting the next task for host managed_node1 11124 1726882372.79170: done getting next task for host managed_node1 11124 1726882372.79174: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11124 1726882372.79176: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882372.79179: getting variables 11124 1726882372.79181: in VariableManager get_vars() 11124 1726882372.79228: Calling all_inventory to load vars for managed_node1 11124 1726882372.79231: Calling groups_inventory to load vars for managed_node1 11124 1726882372.79234: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.79246: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.79251: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.79254: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.79458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.79701: done with get_vars() 11124 1726882372.79714: done getting variables 11124 1726882372.79778: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882372.80017: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001c 11124 1726882372.80022: WORKER PROCESS EXITING TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28 Friday 20 September 2024 21:32:52 -0400 (0:00:00.068) 0:00:13.042 ****** 11124 1726882372.80037: entering _queue_task() for managed_node1/command 11124 1726882372.80502: worker is 1 (out of 1 available) 11124 1726882372.80513: exiting _queue_task() for managed_node1/command 11124 1726882372.80524: done queuing things up, now waiting for results queue to drain 11124 1726882372.80526: waiting for pending results... 
11124 1726882372.81634: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 11124 1726882372.81893: in run() - task 0e448fcc-3ce9-8362-0f62-00000000001d 11124 1726882372.81917: variable 'ansible_search_path' from source: unknown 11124 1726882372.81968: calling self._execute() 11124 1726882372.82095: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.82112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.82131: variable 'omit' from source: magic vars 11124 1726882372.82627: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.82665: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.82801: variable 'network_provider' from source: set_fact 11124 1726882372.82813: Evaluated conditional (network_provider == "initscripts"): False 11124 1726882372.82825: when evaluation is False, skipping this task 11124 1726882372.82834: _execute() done 11124 1726882372.82841: dumping result to json 11124 1726882372.82856: done dumping result, returning 11124 1726882372.82874: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript [0e448fcc-3ce9-8362-0f62-00000000001d] 11124 1726882372.82889: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001d skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11124 1726882372.83037: no more pending results, returning what we have 11124 1726882372.83041: results queue empty 11124 1726882372.83042: checking for any_errors_fatal 11124 1726882372.83050: done checking for any_errors_fatal 11124 1726882372.83051: checking for max_fail_percentage 11124 1726882372.83053: done checking for max_fail_percentage 11124 1726882372.83054: checking to see if all hosts have failed and the running result is not ok 11124 
1726882372.83055: done checking to see if all hosts have failed 11124 1726882372.83055: getting the remaining hosts for this loop 11124 1726882372.83057: done getting the remaining hosts for this loop 11124 1726882372.83061: getting the next task for host managed_node1 11124 1726882372.83068: done getting next task for host managed_node1 11124 1726882372.83071: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 11124 1726882372.83074: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882372.83077: getting variables 11124 1726882372.83079: in VariableManager get_vars() 11124 1726882372.83123: Calling all_inventory to load vars for managed_node1 11124 1726882372.83126: Calling groups_inventory to load vars for managed_node1 11124 1726882372.83128: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.83142: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.83145: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.83147: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.83386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.83618: done with get_vars() 11124 1726882372.83628: done getting variables 11124 1726882372.83812: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001d 11124 1726882372.83815: WORKER PROCESS EXITING 11124 1726882372.83832: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33 Friday 20 September 2024 21:32:52 -0400 (0:00:00.038) 0:00:13.082 ****** 11124 1726882372.84574: entering _queue_task() for managed_node1/debug 11124 1726882372.84947: worker is 1 (out of 1 available) 11124 1726882372.84961: exiting _queue_task() for managed_node1/debug 11124 1726882372.84975: done queuing things up, now waiting for results queue to drain 11124 1726882372.84977: waiting for pending results... 11124 1726882372.85810: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 11124 1726882372.85916: in run() - task 0e448fcc-3ce9-8362-0f62-00000000001e 11124 1726882372.85941: variable 'ansible_search_path' from source: unknown 11124 1726882372.85989: calling self._execute() 11124 1726882372.86088: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.86098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.86121: variable 'omit' from source: magic vars 11124 1726882372.86915: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.86932: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.86943: variable 'omit' from source: magic vars 11124 1726882372.86994: variable 'omit' from source: magic vars 11124 1726882372.87047: variable 'omit' from source: magic vars 11124 1726882372.87211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882372.87251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 11124 1726882372.87362: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882372.87390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.87504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882372.87613: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882372.87733: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.87866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.88076: Set connection var ansible_shell_executable to /bin/sh 11124 1726882372.88098: Set connection var ansible_shell_type to sh 11124 1726882372.88116: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882372.88130: Set connection var ansible_timeout to 10 11124 1726882372.88140: Set connection var ansible_pipelining to False 11124 1726882372.88147: Set connection var ansible_connection to ssh 11124 1726882372.88218: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.88228: variable 'ansible_connection' from source: unknown 11124 1726882372.88237: variable 'ansible_module_compression' from source: unknown 11124 1726882372.88245: variable 'ansible_shell_type' from source: unknown 11124 1726882372.88255: variable 'ansible_shell_executable' from source: unknown 11124 1726882372.88265: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.88274: variable 'ansible_pipelining' from source: unknown 11124 1726882372.88281: variable 'ansible_timeout' from source: unknown 11124 1726882372.88294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.88473: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882372.88490: variable 'omit' from source: magic vars 11124 1726882372.88502: starting attempt loop 11124 1726882372.88512: running the handler 11124 1726882372.88578: handler run complete 11124 1726882372.88603: attempt loop complete, returning result 11124 1726882372.88610: _execute() done 11124 1726882372.88622: dumping result to json 11124 1726882372.88629: done dumping result, returning 11124 1726882372.88643: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [0e448fcc-3ce9-8362-0f62-00000000001e] 11124 1726882372.88659: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001e ok: [managed_node1] => {} MSG: ################################################## 11124 1726882372.88822: no more pending results, returning what we have 11124 1726882372.88826: results queue empty 11124 1726882372.88827: checking for any_errors_fatal 11124 1726882372.88832: done checking for any_errors_fatal 11124 1726882372.88833: checking for max_fail_percentage 11124 1726882372.88834: done checking for max_fail_percentage 11124 1726882372.88835: checking to see if all hosts have failed and the running result is not ok 11124 1726882372.88837: done checking to see if all hosts have failed 11124 1726882372.88837: getting the remaining hosts for this loop 11124 1726882372.88839: done getting the remaining hosts for this loop 11124 1726882372.88842: getting the next task for host managed_node1 11124 1726882372.88852: done getting next task for host managed_node1 11124 1726882372.88859: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11124 1726882372.88865: ^ state is: HOST STATE: block=2, task=7, rescue=0, 
always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882372.88884: getting variables 11124 1726882372.88887: in VariableManager get_vars() 11124 1726882372.88935: Calling all_inventory to load vars for managed_node1 11124 1726882372.88938: Calling groups_inventory to load vars for managed_node1 11124 1726882372.88941: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.88956: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.88959: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.88962: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.89152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.89376: done with get_vars() 11124 1726882372.89394: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:32:52 -0400 (0:00:00.050) 0:00:13.138 ****** 11124 1726882372.89623: entering _queue_task() for managed_node1/include_tasks 11124 1726882372.89719: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000001e 11124 1726882372.89723: WORKER PROCESS EXITING 11124 1726882372.90198: worker is 1 (out of 1 available) 11124 1726882372.90209: exiting _queue_task() for managed_node1/include_tasks 11124 1726882372.90220: done queuing 
things up, now waiting for results queue to drain 11124 1726882372.90221: waiting for pending results... 11124 1726882372.90563: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11124 1726882372.90707: in run() - task 0e448fcc-3ce9-8362-0f62-000000000026 11124 1726882372.90726: variable 'ansible_search_path' from source: unknown 11124 1726882372.90744: variable 'ansible_search_path' from source: unknown 11124 1726882372.90791: calling self._execute() 11124 1726882372.90881: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.90893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.90907: variable 'omit' from source: magic vars 11124 1726882372.91642: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.91665: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.91676: _execute() done 11124 1726882372.91683: dumping result to json 11124 1726882372.91689: done dumping result, returning 11124 1726882372.91699: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-8362-0f62-000000000026] 11124 1726882372.91714: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000026 11124 1726882372.91853: no more pending results, returning what we have 11124 1726882372.91859: in VariableManager get_vars() 11124 1726882372.91907: Calling all_inventory to load vars for managed_node1 11124 1726882372.91910: Calling groups_inventory to load vars for managed_node1 11124 1726882372.91913: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.91929: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.91932: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.91936: Calling groups_plugins_play to load vars for managed_node1 
11124 1726882372.92382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.92697: done with get_vars() 11124 1726882372.92703: variable 'ansible_search_path' from source: unknown 11124 1726882372.92704: variable 'ansible_search_path' from source: unknown 11124 1726882372.92832: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000026 11124 1726882372.92835: WORKER PROCESS EXITING 11124 1726882372.92868: we have included files to process 11124 1726882372.92869: generating all_blocks data 11124 1726882372.92871: done generating all_blocks data 11124 1726882372.92874: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11124 1726882372.92875: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11124 1726882372.92878: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11124 1726882372.94001: done processing included file 11124 1726882372.94003: iterating over new_blocks loaded from include file 11124 1726882372.94004: in VariableManager get_vars() 11124 1726882372.94028: done with get_vars() 11124 1726882372.94030: filtering new block on tags 11124 1726882372.94166: done filtering new block on tags 11124 1726882372.94170: in VariableManager get_vars() 11124 1726882372.94193: done with get_vars() 11124 1726882372.94195: filtering new block on tags 11124 1726882372.94215: done filtering new block on tags 11124 1726882372.94218: in VariableManager get_vars() 11124 1726882372.94240: done with get_vars() 11124 1726882372.94242: filtering new block on tags 11124 1726882372.94376: done filtering new block on tags 11124 1726882372.94379: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11124 1726882372.94384: extending task lists for all hosts with included blocks 11124 1726882372.96156: done extending task lists 11124 1726882372.96157: done processing included files 11124 1726882372.96158: results queue empty 11124 1726882372.96159: checking for any_errors_fatal 11124 1726882372.96161: done checking for any_errors_fatal 11124 1726882372.96162: checking for max_fail_percentage 11124 1726882372.96165: done checking for max_fail_percentage 11124 1726882372.96166: checking to see if all hosts have failed and the running result is not ok 11124 1726882372.96167: done checking to see if all hosts have failed 11124 1726882372.96167: getting the remaining hosts for this loop 11124 1726882372.96169: done getting the remaining hosts for this loop 11124 1726882372.96171: getting the next task for host managed_node1 11124 1726882372.96175: done getting next task for host managed_node1 11124 1726882372.96178: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11124 1726882372.96181: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11124 1726882372.96191: getting variables 11124 1726882372.96192: in VariableManager get_vars() 11124 1726882372.96209: Calling all_inventory to load vars for managed_node1 11124 1726882372.96211: Calling groups_inventory to load vars for managed_node1 11124 1726882372.96213: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882372.96225: Calling all_plugins_play to load vars for managed_node1 11124 1726882372.96228: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882372.96232: Calling groups_plugins_play to load vars for managed_node1 11124 1726882372.96382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882372.96593: done with get_vars() 11124 1726882372.96602: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:32:52 -0400 (0:00:00.070) 0:00:13.209 ****** 11124 1726882372.96682: entering _queue_task() for managed_node1/setup 11124 1726882372.96984: worker is 1 (out of 1 available) 11124 1726882372.96997: exiting _queue_task() for managed_node1/setup 11124 1726882372.97009: done queuing things up, now waiting for results queue to drain 11124 1726882372.97011: waiting for pending results... 
11124 1726882372.97284: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11124 1726882372.97447: in run() - task 0e448fcc-3ce9-8362-0f62-000000000189 11124 1726882372.97475: variable 'ansible_search_path' from source: unknown 11124 1726882372.97483: variable 'ansible_search_path' from source: unknown 11124 1726882372.97532: calling self._execute() 11124 1726882372.97633: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882372.97656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882372.97923: variable 'omit' from source: magic vars 11124 1726882372.98321: variable 'ansible_distribution_major_version' from source: facts 11124 1726882372.98459: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882372.99008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882373.02086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882373.02172: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882373.02226: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882373.02321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882373.02391: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882373.03075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882373.03115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882373.03282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882373.03417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882373.03438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882373.03494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882373.03527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882373.03551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882373.03589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882373.03604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882373.04116: variable '__network_required_facts' from source: role 
'' defaults 11124 1726882373.04129: variable 'ansible_facts' from source: unknown 11124 1726882373.04908: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11124 1726882373.04920: when evaluation is False, skipping this task 11124 1726882373.05009: _execute() done 11124 1726882373.05034: dumping result to json 11124 1726882373.05042: done dumping result, returning 11124 1726882373.05053: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-8362-0f62-000000000189] 11124 1726882373.05066: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000189 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882373.05245: no more pending results, returning what we have 11124 1726882373.05250: results queue empty 11124 1726882373.05251: checking for any_errors_fatal 11124 1726882373.05252: done checking for any_errors_fatal 11124 1726882373.05253: checking for max_fail_percentage 11124 1726882373.05257: done checking for max_fail_percentage 11124 1726882373.05258: checking to see if all hosts have failed and the running result is not ok 11124 1726882373.05259: done checking to see if all hosts have failed 11124 1726882373.05260: getting the remaining hosts for this loop 11124 1726882373.05265: done getting the remaining hosts for this loop 11124 1726882373.05269: getting the next task for host managed_node1 11124 1726882373.05284: done getting next task for host managed_node1 11124 1726882373.05290: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11124 1726882373.05297: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882373.05313: getting variables 11124 1726882373.05315: in VariableManager get_vars() 11124 1726882373.05372: Calling all_inventory to load vars for managed_node1 11124 1726882373.05375: Calling groups_inventory to load vars for managed_node1 11124 1726882373.05381: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882373.05395: Calling all_plugins_play to load vars for managed_node1 11124 1726882373.05398: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882373.05404: Calling groups_plugins_play to load vars for managed_node1 11124 1726882373.05856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882373.06587: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000189 11124 1726882373.06590: WORKER PROCESS EXITING 11124 1726882373.06597: done with get_vars() 11124 1726882373.06607: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:32:53 -0400 (0:00:00.101) 0:00:13.311 ****** 11124 1726882373.06904: entering _queue_task() for managed_node1/stat 11124 1726882373.07372: worker is 
1 (out of 1 available) 11124 1726882373.07384: exiting _queue_task() for managed_node1/stat 11124 1726882373.07398: done queuing things up, now waiting for results queue to drain 11124 1726882373.07399: waiting for pending results... 11124 1726882373.08115: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11124 1726882373.08293: in run() - task 0e448fcc-3ce9-8362-0f62-00000000018b 11124 1726882373.08306: variable 'ansible_search_path' from source: unknown 11124 1726882373.08311: variable 'ansible_search_path' from source: unknown 11124 1726882373.08345: calling self._execute() 11124 1726882373.08438: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882373.08444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882373.08453: variable 'omit' from source: magic vars 11124 1726882373.08787: variable 'ansible_distribution_major_version' from source: facts 11124 1726882373.08798: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882373.08959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882373.09260: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882373.09318: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882373.09366: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882373.09416: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882373.09552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882373.09601: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882373.09642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882373.09687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882373.09783: variable '__network_is_ostree' from source: set_fact 11124 1726882373.09819: Evaluated conditional (not __network_is_ostree is defined): False 11124 1726882373.09822: when evaluation is False, skipping this task 11124 1726882373.09824: _execute() done 11124 1726882373.09829: dumping result to json 11124 1726882373.09832: done dumping result, returning 11124 1726882373.09872: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-8362-0f62-00000000018b] 11124 1726882373.09878: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018b 11124 1726882373.09970: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018b 11124 1726882373.09972: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11124 1726882373.10070: no more pending results, returning what we have 11124 1726882373.10077: results queue empty 11124 1726882373.10078: checking for any_errors_fatal 11124 1726882373.10088: done checking for any_errors_fatal 11124 1726882373.10089: checking for max_fail_percentage 11124 1726882373.10091: done checking for max_fail_percentage 11124 1726882373.10092: checking to see if all hosts have failed and the running result is not ok 11124 
1726882373.10093: done checking to see if all hosts have failed 11124 1726882373.10094: getting the remaining hosts for this loop 11124 1726882373.10095: done getting the remaining hosts for this loop 11124 1726882373.10101: getting the next task for host managed_node1 11124 1726882373.10109: done getting next task for host managed_node1 11124 1726882373.10113: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11124 1726882373.10119: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882373.10136: getting variables 11124 1726882373.10138: in VariableManager get_vars() 11124 1726882373.10191: Calling all_inventory to load vars for managed_node1 11124 1726882373.10197: Calling groups_inventory to load vars for managed_node1 11124 1726882373.10199: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882373.10210: Calling all_plugins_play to load vars for managed_node1 11124 1726882373.10213: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882373.10217: Calling groups_plugins_play to load vars for managed_node1 11124 1726882373.10471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882373.10742: done with get_vars() 11124 1726882373.10755: done getting variables 11124 1726882373.10829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:32:53 -0400 (0:00:00.039) 0:00:13.351 ****** 11124 1726882373.10874: entering _queue_task() for managed_node1/set_fact 11124 1726882373.11262: worker is 1 (out of 1 available) 11124 1726882373.11277: exiting _queue_task() for managed_node1/set_fact 11124 1726882373.11292: done queuing things up, now waiting for results queue to drain 11124 1726882373.11293: waiting for pending results... 
11124 1726882373.11610: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11124 1726882373.11896: in run() - task 0e448fcc-3ce9-8362-0f62-00000000018c 11124 1726882373.11916: variable 'ansible_search_path' from source: unknown 11124 1726882373.11953: variable 'ansible_search_path' from source: unknown 11124 1726882373.12097: calling self._execute() 11124 1726882373.12301: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882373.12312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882373.12325: variable 'omit' from source: magic vars 11124 1726882373.13192: variable 'ansible_distribution_major_version' from source: facts 11124 1726882373.13227: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882373.13567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882373.13959: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882373.14007: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882373.14051: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882373.14089: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882373.14191: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882373.14225: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882373.14272: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882373.14311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882373.14426: variable '__network_is_ostree' from source: set_fact 11124 1726882373.14439: Evaluated conditional (not __network_is_ostree is defined): False 11124 1726882373.14454: when evaluation is False, skipping this task 11124 1726882373.14467: _execute() done 11124 1726882373.14478: dumping result to json 11124 1726882373.14484: done dumping result, returning 11124 1726882373.14504: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-8362-0f62-00000000018c] 11124 1726882373.14522: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11124 1726882373.14693: no more pending results, returning what we have 11124 1726882373.14698: results queue empty 11124 1726882373.14699: checking for any_errors_fatal 11124 1726882373.14704: done checking for any_errors_fatal 11124 1726882373.14705: checking for max_fail_percentage 11124 1726882373.14707: done checking for max_fail_percentage 11124 1726882373.14708: checking to see if all hosts have failed and the running result is not ok 11124 1726882373.14709: done checking to see if all hosts have failed 11124 1726882373.14710: getting the remaining hosts for this loop 11124 1726882373.14712: done getting the remaining hosts for this loop 11124 1726882373.14715: getting the next task for host managed_node1 11124 1726882373.14725: done getting next task for host managed_node1 11124 
1726882373.14729: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11124 1726882373.14734: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882373.14748: getting variables 11124 1726882373.14753: in VariableManager get_vars() 11124 1726882373.14804: Calling all_inventory to load vars for managed_node1 11124 1726882373.14807: Calling groups_inventory to load vars for managed_node1 11124 1726882373.14810: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882373.14821: Calling all_plugins_play to load vars for managed_node1 11124 1726882373.14825: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882373.14828: Calling groups_plugins_play to load vars for managed_node1 11124 1726882373.15063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882373.15436: done with get_vars() 11124 1726882373.15447: done getting variables 11124 1726882373.15480: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018c 11124 1726882373.15483: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:32:53 -0400 (0:00:00.061) 0:00:13.412 ****** 11124 1726882373.17020: entering _queue_task() for managed_node1/service_facts 11124 1726882373.17022: Creating lock for service_facts 11124 1726882373.17654: worker is 1 (out of 1 available) 11124 1726882373.17874: exiting _queue_task() for managed_node1/service_facts 11124 1726882373.17892: done queuing things up, now waiting for results queue to drain 11124 1726882373.17894: waiting for pending results... 11124 1726882373.18706: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11124 1726882373.18969: in run() - task 0e448fcc-3ce9-8362-0f62-00000000018e 11124 1726882373.19074: variable 'ansible_search_path' from source: unknown 11124 1726882373.19084: variable 'ansible_search_path' from source: unknown 11124 1726882373.19136: calling self._execute() 11124 1726882373.19296: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882373.19310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882373.19440: variable 'omit' from source: magic vars 11124 1726882373.20146: variable 'ansible_distribution_major_version' from source: facts 11124 1726882373.20307: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882373.20321: variable 'omit' from source: magic vars 11124 1726882373.20519: variable 'omit' from source: magic vars 11124 1726882373.20559: variable 'omit' from source: magic vars 11124 1726882373.20609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882373.20659: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882373.20755: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882373.20785: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882373.20854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882373.20889: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882373.20955: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882373.20964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882373.21182: Set connection var ansible_shell_executable to /bin/sh 11124 1726882373.21197: Set connection var ansible_shell_type to sh 11124 1726882373.21209: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882373.21218: Set connection var ansible_timeout to 10 11124 1726882373.21235: Set connection var ansible_pipelining to False 11124 1726882373.21282: Set connection var ansible_connection to ssh 11124 1726882373.21308: variable 'ansible_shell_executable' from source: unknown 11124 1726882373.21376: variable 'ansible_connection' from source: unknown 11124 1726882373.21386: variable 'ansible_module_compression' from source: unknown 11124 1726882373.21393: variable 'ansible_shell_type' from source: unknown 11124 1726882373.21400: variable 'ansible_shell_executable' from source: unknown 11124 1726882373.21406: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882373.21418: variable 'ansible_pipelining' from source: unknown 11124 1726882373.21429: variable 'ansible_timeout' from source: unknown 11124 1726882373.21438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882373.21866: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882373.21942: variable 'omit' from source: magic vars 11124 1726882373.21981: starting attempt loop 11124 1726882373.21990: running the handler 11124 1726882373.22010: _low_level_execute_command(): starting 11124 1726882373.22043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882373.24810: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882373.24832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.24850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.24876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.24922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.24935: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882373.24951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.24973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882373.24996: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882373.25009: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882373.25022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.25035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.25054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.25069: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.25081: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882373.25098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.25181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882373.25212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882373.25238: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882373.25374: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882373.27038: stdout chunk (state=3): >>>/root <<< 11124 1726882373.27187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882373.27240: stderr chunk (state=3): >>><<< 11124 1726882373.27243: stdout chunk (state=3): >>><<< 11124 1726882373.27372: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882373.27376: _low_level_execute_command(): starting 11124 1726882373.27379: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599 `" && echo ansible-tmp-1726882373.2726593-11799-24793588753599="` echo /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599 `" ) && sleep 0' 11124 1726882373.29070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882373.29087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.29102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.29125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.29172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.29233: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882373.29248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.29271: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882373.29284: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882373.29294: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882373.29306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.29318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.29338: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.29353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.29367: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882373.29381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.29568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882373.29585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882373.29598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882373.29783: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882373.31682: stdout chunk (state=3): >>>ansible-tmp-1726882373.2726593-11799-24793588753599=/root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599 <<< 11124 1726882373.31893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882373.31899: stdout chunk (state=3): >>><<< 11124 1726882373.31902: stderr chunk (state=3): >>><<< 11124 1726882373.32077: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882373.2726593-11799-24793588753599=/root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882373.32083: variable 'ansible_module_compression' from source: unknown 11124 1726882373.32085: ANSIBALLZ: Using lock for service_facts 11124 1726882373.32088: ANSIBALLZ: Acquiring lock 11124 1726882373.32092: ANSIBALLZ: Lock acquired: 139628947554528 11124 1726882373.32094: ANSIBALLZ: Creating module 11124 1726882373.56719: ANSIBALLZ: Writing module into payload 11124 1726882373.56835: ANSIBALLZ: Writing module 11124 1726882373.56868: ANSIBALLZ: Renaming module 11124 1726882373.56875: ANSIBALLZ: Done creating module 11124 1726882373.56893: variable 'ansible_facts' from source: unknown 11124 1726882373.56972: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/AnsiballZ_service_facts.py 11124 1726882373.57686: Sending initial data 11124 1726882373.57689: Sent initial data (161 bytes) 11124 1726882373.60000: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882373.60139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.60149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.60168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.60207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.60267: stderr chunk (state=3): 
>>>debug2: match not found <<< 11124 1726882373.60276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.60289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882373.60297: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882373.60304: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882373.60348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.60362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.60381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.60387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.60396: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882373.60403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.60470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882373.60573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882373.60587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882373.60712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882373.62594: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882373.62694: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882373.62796: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp8iwf6n05 /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/AnsiballZ_service_facts.py <<< 11124 1726882373.62884: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882373.64362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882373.64627: stderr chunk (state=3): >>><<< 11124 1726882373.64631: stdout chunk (state=3): >>><<< 11124 1726882373.64633: done transferring module to remote 11124 1726882373.64635: _low_level_execute_command(): starting 11124 1726882373.64637: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/ /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/AnsiballZ_service_facts.py && sleep 0' 11124 1726882373.66056: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.66060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.66211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.66215: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.66217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.66408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882373.66411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882373.66418: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882373.66519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882373.68328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882373.68410: stderr chunk (state=3): >>><<< 11124 1726882373.68413: stdout chunk (state=3): >>><<< 11124 1726882373.68511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882373.68515: _low_level_execute_command(): starting 11124 1726882373.68518: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/AnsiballZ_service_facts.py && sleep 0' 11124 1726882373.70327: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882373.70389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.70398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.70409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.70521: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.70530: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882373.70542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.70550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882373.70572: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882373.70577: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882373.70580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882373.70582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882373.70606: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882373.70609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882373.70611: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882373.70616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882373.70692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882373.70761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882373.70785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882373.70960: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882375.05196: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 11124 1726882375.05213: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": 
{"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "s<<< 11124 1726882375.05218: stdout chunk (state=3): >>>tate": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static<<< 11124 1726882375.05233: stdout chunk (state=3): >>>", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"n<<< 11124 1726882375.05241: stdout chunk (state=3): >>>ame": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", 
"source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", <<< 11124 1726882375.05271: stdout chunk (state=3): >>>"status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": 
"NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": 
"disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "sys<<< 11124 1726882375.05275: stdout chunk (state=3): >>>temd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": 
"systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11124 1726882375.06542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
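The stdout above is the `service_facts` module returning its result as one JSON document: an `ansible_facts.services` mapping keyed by unit name, where each entry carries `name`, `state`, `status`, and `source`. As a minimal sketch of consuming that structure (the two entries below are a hypothetical excerpt shaped like the real output, not the full result):

```python
import json

# Hypothetical excerpt shaped like the service_facts result logged above.
raw = """
{"ansible_facts": {"services": {
  "sshd.service": {"name": "sshd.service", "state": "running",
                   "status": "enabled", "source": "systemd"},
  "nfs-server.service": {"name": "nfs-server.service", "state": "stopped",
                         "status": "disabled", "source": "systemd"}
}}, "invocation": {"module_args": {}}}
"""
result = json.loads(raw)

# Filter the services mapping down to units currently running.
services = result["ansible_facts"]["services"]
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # -> ['sshd.service']
```

In a playbook the same mapping is reachable as `ansible_facts.services` after a `service_facts` task, so the equivalent filter can be written as a Jinja2 expression over that variable.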
<<< 11124 1726882375.06611: stderr chunk (state=3): >>><<< 11124 1726882375.06614: stdout chunk (state=3): >>><<< 11124 1726882375.06634: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": 
"systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": 
{"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": 
{"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882375.07000: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882375.07007: _low_level_execute_command(): starting 11124 1726882375.07012: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882373.2726593-11799-24793588753599/ > /dev/null 2>&1 && sleep 0' 11124 1726882375.07475: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.07480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.07512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.07526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 11124 1726882375.07537: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.07584: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882375.07596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882375.07699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882375.09493: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882375.09536: stderr chunk (state=3): >>><<< 11124 1726882375.09539: stdout chunk (state=3): >>><<< 11124 1726882375.09553: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882375.09556: handler run complete 
11124 1726882375.09660: variable 'ansible_facts' from source: unknown 11124 1726882375.09742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882375.10379: variable 'ansible_facts' from source: unknown 11124 1726882375.10512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882375.10721: attempt loop complete, returning result 11124 1726882375.10733: _execute() done 11124 1726882375.10740: dumping result to json 11124 1726882375.10810: done dumping result, returning 11124 1726882375.10824: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-8362-0f62-00000000018e] 11124 1726882375.10832: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018e 11124 1726882375.11923: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018e 11124 1726882375.11925: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882375.11979: no more pending results, returning what we have 11124 1726882375.11981: results queue empty 11124 1726882375.11982: checking for any_errors_fatal 11124 1726882375.11985: done checking for any_errors_fatal 11124 1726882375.11986: checking for max_fail_percentage 11124 1726882375.11987: done checking for max_fail_percentage 11124 1726882375.11988: checking to see if all hosts have failed and the running result is not ok 11124 1726882375.11989: done checking to see if all hosts have failed 11124 1726882375.11989: getting the remaining hosts for this loop 11124 1726882375.11990: done getting the remaining hosts for this loop 11124 1726882375.11994: getting the next task for host managed_node1 11124 1726882375.11999: done getting next task for host managed_node1 11124 1726882375.12002: ^ task 
is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11124 1726882375.12006: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882375.12015: getting variables 11124 1726882375.12016: in VariableManager get_vars() 11124 1726882375.12047: Calling all_inventory to load vars for managed_node1 11124 1726882375.12052: Calling groups_inventory to load vars for managed_node1 11124 1726882375.12054: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882375.12065: Calling all_plugins_play to load vars for managed_node1 11124 1726882375.12067: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882375.12072: Calling groups_plugins_play to load vars for managed_node1 11124 1726882375.12392: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882375.12665: done with get_vars() 11124 1726882375.12674: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 
21:32:55 -0400 (0:00:01.957) 0:00:15.369 ****** 11124 1726882375.12740: entering _queue_task() for managed_node1/package_facts 11124 1726882375.12744: Creating lock for package_facts 11124 1726882375.12937: worker is 1 (out of 1 available) 11124 1726882375.12952: exiting _queue_task() for managed_node1/package_facts 11124 1726882375.12967: done queuing things up, now waiting for results queue to drain 11124 1726882375.12968: waiting for pending results... 11124 1726882375.13119: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11124 1726882375.13205: in run() - task 0e448fcc-3ce9-8362-0f62-00000000018f 11124 1726882375.13216: variable 'ansible_search_path' from source: unknown 11124 1726882375.13219: variable 'ansible_search_path' from source: unknown 11124 1726882375.13247: calling self._execute() 11124 1726882375.13310: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882375.13325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882375.13344: variable 'omit' from source: magic vars 11124 1726882375.13717: variable 'ansible_distribution_major_version' from source: facts 11124 1726882375.13733: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882375.13744: variable 'omit' from source: magic vars 11124 1726882375.13828: variable 'omit' from source: magic vars 11124 1726882375.13869: variable 'omit' from source: magic vars 11124 1726882375.13915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882375.13954: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882375.13981: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882375.14008: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11124 1726882375.14024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882375.14059: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882375.14070: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882375.14079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882375.14188: Set connection var ansible_shell_executable to /bin/sh 11124 1726882375.14201: Set connection var ansible_shell_type to sh 11124 1726882375.14212: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882375.14227: Set connection var ansible_timeout to 10 11124 1726882375.14237: Set connection var ansible_pipelining to False 11124 1726882375.14243: Set connection var ansible_connection to ssh 11124 1726882375.14271: variable 'ansible_shell_executable' from source: unknown 11124 1726882375.14280: variable 'ansible_connection' from source: unknown 11124 1726882375.14288: variable 'ansible_module_compression' from source: unknown 11124 1726882375.14294: variable 'ansible_shell_type' from source: unknown 11124 1726882375.14300: variable 'ansible_shell_executable' from source: unknown 11124 1726882375.14306: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882375.14313: variable 'ansible_pipelining' from source: unknown 11124 1726882375.14318: variable 'ansible_timeout' from source: unknown 11124 1726882375.14325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882375.14524: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882375.14540: variable 'omit' from 
source: magic vars 11124 1726882375.14557: starting attempt loop 11124 1726882375.14565: running the handler 11124 1726882375.14583: _low_level_execute_command(): starting 11124 1726882375.14595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882375.15353: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882375.15371: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.15387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.15405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.15446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.15460: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882375.15474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.15489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882375.15499: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882375.15507: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882375.15520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.15532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.15545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.15558: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.15572: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882375.15587: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.15671: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882375.15688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882375.15702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882375.15874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882375.17484: stdout chunk (state=3): >>>/root <<< 11124 1726882375.17589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882375.17669: stderr chunk (state=3): >>><<< 11124 1726882375.17672: stdout chunk (state=3): >>><<< 11124 1726882375.17771: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882375.17774: _low_level_execute_command(): 
starting 11124 1726882375.17777: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114 `" && echo ansible-tmp-1726882375.1768987-11913-58987419967114="` echo /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114 `" ) && sleep 0' 11124 1726882375.19335: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882375.19350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.19369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.19389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.19508: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.19521: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882375.19535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.19554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882375.19569: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882375.19581: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882375.19594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.19609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.19625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.19637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.19648: stderr chunk 
(state=3): >>>debug2: match found <<< 11124 1726882375.19665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.19742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882375.19890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882375.19900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882375.20023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882375.21894: stdout chunk (state=3): >>>ansible-tmp-1726882375.1768987-11913-58987419967114=/root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114 <<< 11124 1726882375.22085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882375.22088: stdout chunk (state=3): >>><<< 11124 1726882375.22090: stderr chunk (state=3): >>><<< 11124 1726882375.22170: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882375.1768987-11913-58987419967114=/root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882375.22369: variable 'ansible_module_compression' from source: unknown 11124 1726882375.22372: ANSIBALLZ: Using lock for package_facts 11124 1726882375.22375: ANSIBALLZ: Acquiring lock 11124 1726882375.22377: ANSIBALLZ: Lock acquired: 139628944981840 11124 1726882375.22379: ANSIBALLZ: Creating module 11124 1726882375.56912: ANSIBALLZ: Writing module into payload 11124 1726882375.57100: ANSIBALLZ: Writing module 11124 1726882375.57139: ANSIBALLZ: Renaming module 11124 1726882375.57154: ANSIBALLZ: Done creating module 11124 1726882375.57204: variable 'ansible_facts' from source: unknown 11124 1726882375.57488: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/AnsiballZ_package_facts.py 11124 1726882375.58604: Sending initial data 11124 1726882375.58607: Sent initial data (161 bytes) 11124 1726882375.59909: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882375.59927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.59944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.59969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.60010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.60027: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882375.60042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.60063: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 11124 1726882375.60078: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882375.60089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882375.60101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.60115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.60136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.60151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.60165: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882375.60179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.60262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882375.60288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882375.60306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882375.60437: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882375.62282: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 
1726882375.62372: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882375.62479: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmppb8_k8nu /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/AnsiballZ_package_facts.py <<< 11124 1726882375.62571: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882375.65834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882375.65968: stderr chunk (state=3): >>><<< 11124 1726882375.65972: stdout chunk (state=3): >>><<< 11124 1726882375.65975: done transferring module to remote 11124 1726882375.65977: _low_level_execute_command(): starting 11124 1726882375.65979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/ /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/AnsiballZ_package_facts.py && sleep 0' 11124 1726882375.66557: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882375.66574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.66587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.66603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.66645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.66660: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882375.66675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.66690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 11124 1726882375.66700: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882375.66709: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882375.66722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.66733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.66746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.66758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.66770: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882375.66781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.66860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882375.66881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882375.66894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882375.67012: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882375.68793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882375.68868: stderr chunk (state=3): >>><<< 11124 1726882375.68879: stdout chunk (state=3): >>><<< 11124 1726882375.68969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882375.68973: _low_level_execute_command(): starting 11124 1726882375.68976: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/AnsiballZ_package_facts.py && sleep 0' 11124 1726882375.69573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882375.69586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.69602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.69620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.69677: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.69690: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882375.69704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.69720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882375.69732: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address <<< 11124 1726882375.69742: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882375.69762: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882375.69779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882375.69794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882375.69806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882375.69817: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882375.69830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882375.69914: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882375.69934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882375.69951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882375.70082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882376.16039: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 11124 1726882376.16057: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", 
"version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": nu<<< 11124 1726882376.16072: stdout chunk (state=3): >>>ll, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": 
[{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 11124 1726882376.16100: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": 
"libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 11124 1726882376.16115: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 11124 
1726882376.16143: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", 
"version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 11124 1726882376.16159: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 11124 1726882376.16167: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": 
[{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 11124 1726882376.16174: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", 
"release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", 
"release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-<<< 11124 1726882376.16180: stdout chunk (state=3): >>>base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "a<<< 11124 1726882376.16182: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": 
"authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 11124 1726882376.16209: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", 
"release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, <<< 11124 1726882376.16213: stdout chunk (state=3): >>>"arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": 
[{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300<<< 11124 1726882376.16217: stdout chunk (state=3): >>>", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": 
"perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64"<<< 11124 1726882376.16247: stdout chunk (state=3): >>>, "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": 
[{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_6<<< 11124 1726882376.16261: stdout chunk (state=3): >>>4", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": 
[{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", <<< 11124 1726882376.16282: stdout chunk (state=3): >>>"release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": 
[{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": 
"5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 11124 1726882376.16287: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11124 1726882376.17723: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882376.17776: stderr chunk (state=3): >>><<< 11124 1726882376.17781: stdout chunk (state=3): >>><<< 11124 1726882376.17813: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": 
"basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", 
"version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": 
"0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": 
"python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": 
"grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": 
[{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": 
"rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": 
[{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 
0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": 
"perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": 
"12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": 
"openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": 
"13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882376.20266: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882376.20288: _low_level_execute_command(): starting 11124 1726882376.20293: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882375.1768987-11913-58987419967114/ > /dev/null 2>&1 && sleep 0' 11124 1726882376.20930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882376.20939: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882376.20951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882376.20972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882376.21006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882376.21013: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882376.21023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882376.21037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882376.21044: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 11124 1726882376.21050: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882376.21062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882376.21074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882376.21085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882376.21093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882376.21099: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882376.21109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882376.21185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882376.21202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882376.21214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882376.21335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882376.23170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882376.23243: stderr chunk (state=3): >>><<< 11124 1726882376.23249: stdout chunk (state=3): >>><<< 11124 1726882376.23279: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882376.23285: handler run complete 11124 1726882376.24347: variable 'ansible_facts' from source: unknown 11124 1726882376.24887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.27295: variable 'ansible_facts' from source: unknown 11124 1726882376.27822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.28717: attempt loop complete, returning result 11124 1726882376.28730: _execute() done 11124 1726882376.28734: dumping result to json 11124 1726882376.28991: done dumping result, returning 11124 1726882376.29012: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-8362-0f62-00000000018f] 11124 1726882376.29017: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018f 11124 1726882376.31513: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000018f 11124 1726882376.31516: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882376.31615: no more pending results, returning what we have 11124 1726882376.31618: results queue empty 11124 1726882376.31619: checking for 
any_errors_fatal 11124 1726882376.31622: done checking for any_errors_fatal 11124 1726882376.31623: checking for max_fail_percentage 11124 1726882376.31625: done checking for max_fail_percentage 11124 1726882376.31626: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.31627: done checking to see if all hosts have failed 11124 1726882376.31628: getting the remaining hosts for this loop 11124 1726882376.31629: done getting the remaining hosts for this loop 11124 1726882376.31633: getting the next task for host managed_node1 11124 1726882376.31639: done getting next task for host managed_node1 11124 1726882376.31643: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11124 1726882376.31647: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882376.31659: getting variables 11124 1726882376.31660: in VariableManager get_vars() 11124 1726882376.31692: Calling all_inventory to load vars for managed_node1 11124 1726882376.31695: Calling groups_inventory to load vars for managed_node1 11124 1726882376.31697: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.31706: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.31708: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.31711: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.32672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.33605: done with get_vars() 11124 1726882376.33624: done getting variables 11124 1726882376.33670: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:32:56 -0400 (0:00:01.209) 0:00:16.579 ****** 11124 1726882376.33693: entering _queue_task() for managed_node1/debug 11124 1726882376.33908: worker is 1 (out of 1 available) 11124 1726882376.33932: exiting _queue_task() for managed_node1/debug 11124 1726882376.33945: done queuing things up, now waiting for results queue to drain 11124 1726882376.33947: waiting for pending results... 
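The `package_facts` result above is rendered as a `"censored"` placeholder because the task ran with `no_log: true` (visible as `'_ansible_no_log': True` in the module arguments). A minimal sketch of that censoring step, not Ansible's actual implementation (which also scrubs diffs, exceptions, and loop item results):

```python
def censor_result(result, no_log):
    """Replace a task result with a placeholder when no_log is set.

    Sketch only: Ansible's real logic preserves a few bookkeeping keys
    and scrubs additional fields; here we keep just 'changed'.
    """
    if not no_log:
        return result
    return {
        "censored": "the output has been hidden due to the fact that "
                    "'no_log: true' was specified for this result",
        "changed": result.get("changed", False),
    }


full = {"changed": False, "ansible_facts": {"packages": {"bash": []}}}
censored = censor_result(full, no_log=True)
print(sorted(censored))  # the facts never reach the displayed output
```

The uncensored facts still flow into the play's variable namespace (note the later `variable 'ansible_facts' from source: unknown` entries); only the displayed result is masked.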
11124 1726882376.34145: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11124 1726882376.34280: in run() - task 0e448fcc-3ce9-8362-0f62-000000000027 11124 1726882376.34303: variable 'ansible_search_path' from source: unknown 11124 1726882376.34313: variable 'ansible_search_path' from source: unknown 11124 1726882376.34356: calling self._execute() 11124 1726882376.34444: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.34458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.34474: variable 'omit' from source: magic vars 11124 1726882376.34841: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.34857: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.34871: variable 'omit' from source: magic vars 11124 1726882376.34928: variable 'omit' from source: magic vars 11124 1726882376.35030: variable 'network_provider' from source: set_fact 11124 1726882376.35053: variable 'omit' from source: magic vars 11124 1726882376.35102: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882376.35144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882376.35169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882376.35190: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882376.35217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882376.35256: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882376.35260: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 
1726882376.35262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.35346: Set connection var ansible_shell_executable to /bin/sh 11124 1726882376.35360: Set connection var ansible_shell_type to sh 11124 1726882376.35369: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882376.35374: Set connection var ansible_timeout to 10 11124 1726882376.35379: Set connection var ansible_pipelining to False 11124 1726882376.35382: Set connection var ansible_connection to ssh 11124 1726882376.35398: variable 'ansible_shell_executable' from source: unknown 11124 1726882376.35401: variable 'ansible_connection' from source: unknown 11124 1726882376.35404: variable 'ansible_module_compression' from source: unknown 11124 1726882376.35407: variable 'ansible_shell_type' from source: unknown 11124 1726882376.35409: variable 'ansible_shell_executable' from source: unknown 11124 1726882376.35411: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.35413: variable 'ansible_pipelining' from source: unknown 11124 1726882376.35416: variable 'ansible_timeout' from source: unknown 11124 1726882376.35421: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.35544: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882376.35557: variable 'omit' from source: magic vars 11124 1726882376.35560: starting attempt loop 11124 1726882376.35564: running the handler 11124 1726882376.35603: handler run complete 11124 1726882376.35614: attempt loop complete, returning result 11124 1726882376.35617: _execute() done 11124 1726882376.35623: dumping result to json 11124 1726882376.35626: done dumping result, returning 
11124 1726882376.35633: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-8362-0f62-000000000027] 11124 1726882376.35638: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000027 11124 1726882376.35720: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000027 11124 1726882376.35723: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 11124 1726882376.35818: no more pending results, returning what we have 11124 1726882376.35821: results queue empty 11124 1726882376.35822: checking for any_errors_fatal 11124 1726882376.35827: done checking for any_errors_fatal 11124 1726882376.35828: checking for max_fail_percentage 11124 1726882376.35829: done checking for max_fail_percentage 11124 1726882376.35830: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.35833: done checking to see if all hosts have failed 11124 1726882376.35833: getting the remaining hosts for this loop 11124 1726882376.35834: done getting the remaining hosts for this loop 11124 1726882376.35838: getting the next task for host managed_node1 11124 1726882376.35843: done getting next task for host managed_node1 11124 1726882376.35846: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11124 1726882376.35849: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 11124 1726882376.35858: getting variables 11124 1726882376.35860: in VariableManager get_vars() 11124 1726882376.35895: Calling all_inventory to load vars for managed_node1 11124 1726882376.35897: Calling groups_inventory to load vars for managed_node1 11124 1726882376.35899: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.35906: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.35908: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.35909: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.36756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.37696: done with get_vars() 11124 1726882376.37712: done getting variables 11124 1726882376.37780: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:32:56 -0400 (0:00:00.041) 0:00:16.620 ****** 11124 1726882376.37802: entering _queue_task() for managed_node1/fail 11124 1726882376.37804: Creating lock for fail 11124 1726882376.38018: worker is 1 (out of 1 available) 11124 1726882376.38031: exiting _queue_task() for managed_node1/fail 11124 1726882376.38043: done queuing things up, now waiting for results queue to drain 11124 1726882376.38045: waiting for pending results... 
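The repeating `entering _queue_task()` / `worker is 1 (out of 1 available)` / `waiting for pending results...` sequence reflects the strategy plugin handing each task to a pool of worker slots sized by `forks`. A toy model of that dispatch loop, under the assumption of a single fork as in this log (real Ansible workers are separate processes pushing onto a shared results queue):

```python
import queue


def queue_task(workers, results, task_name):
    """Pick the first free worker slot, run the task, report the result.

    Toy model of the _queue_task()/results-queue cycle seen in the log;
    returns the 1-based worker number, matching "worker is 1".
    """
    for i, busy in enumerate(workers):
        if not busy:
            workers[i] = True
            results.put((task_name, "ok"))  # worker sends task result back
            workers[i] = False              # slot becomes available again
            return i + 1
    raise RuntimeError("no free workers")   # caller would wait and retry


workers = [False]  # forks=1 -> "worker is 1 (out of 1 available)"
results = queue.Queue()
slot = queue_task(workers, results, "Print network provider")
print("worker is", slot)
```

After queuing, the strategy drains the results queue (`done queuing things up, now waiting for results queue to drain`) before moving to the next task state.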
11124 1726882376.38208: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11124 1726882376.38293: in run() - task 0e448fcc-3ce9-8362-0f62-000000000028 11124 1726882376.38305: variable 'ansible_search_path' from source: unknown 11124 1726882376.38308: variable 'ansible_search_path' from source: unknown 11124 1726882376.38337: calling self._execute() 11124 1726882376.38404: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.38407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.38416: variable 'omit' from source: magic vars 11124 1726882376.38679: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.38689: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.38773: variable 'network_state' from source: role '' defaults 11124 1726882376.38780: Evaluated conditional (network_state != {}): False 11124 1726882376.38784: when evaluation is False, skipping this task 11124 1726882376.38786: _execute() done 11124 1726882376.38789: dumping result to json 11124 1726882376.38791: done dumping result, returning 11124 1726882376.38798: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-8362-0f62-000000000028] 11124 1726882376.38803: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000028 11124 1726882376.38888: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000028 11124 1726882376.38891: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882376.38963: no more pending results, 
returning what we have 11124 1726882376.38968: results queue empty 11124 1726882376.38969: checking for any_errors_fatal 11124 1726882376.38973: done checking for any_errors_fatal 11124 1726882376.38974: checking for max_fail_percentage 11124 1726882376.38975: done checking for max_fail_percentage 11124 1726882376.38976: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.38977: done checking to see if all hosts have failed 11124 1726882376.38977: getting the remaining hosts for this loop 11124 1726882376.38978: done getting the remaining hosts for this loop 11124 1726882376.38981: getting the next task for host managed_node1 11124 1726882376.38987: done getting next task for host managed_node1 11124 1726882376.38990: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11124 1726882376.38993: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882376.39005: getting variables 11124 1726882376.39007: in VariableManager get_vars() 11124 1726882376.39041: Calling all_inventory to load vars for managed_node1 11124 1726882376.39043: Calling groups_inventory to load vars for managed_node1 11124 1726882376.39045: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.39054: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.39056: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.39058: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.40122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.41626: done with get_vars() 11124 1726882376.41644: done getting variables 11124 1726882376.41691: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:32:56 -0400 (0:00:00.039) 0:00:16.659 ****** 11124 1726882376.41715: entering _queue_task() for managed_node1/fail 11124 1726882376.41931: worker is 1 (out of 1 available) 11124 1726882376.41945: exiting _queue_task() for managed_node1/fail 11124 1726882376.41960: done queuing things up, now waiting for results queue to drain 11124 1726882376.41962: waiting for pending results... 
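The `skipping:` result above, with `"false_condition": "network_state != {}"`, is produced when a task's `when:` clause evaluates to false before execution. A simplified sketch of that decision for this specific condition (real Ansible templates arbitrary conditions through Jinja2):

```python
def evaluate_when(condition, variables):
    """Return a 'skipped' result dict when the when: condition is false.

    Simplified: only handles the `network_state != {}` comparison from
    the log; returns None when the task would actually run.
    """
    network_state = variables.get("network_state", {})
    if network_state != {}:
        return None  # condition true: task executes normally
    return {
        "changed": False,
        "false_condition": condition,
        "skip_reason": "Conditional result was False",
    }


# network_state comes from the role's defaults and is empty here,
# so both "Abort applying..." fail tasks are skipped.
print(evaluate_when("network_state != {}", {"network_state": {}}))
```

This is why both `fail`-action tasks above complete without ever contacting the managed host: the conditional short-circuits inside `_execute()`.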
11124 1726882376.42127: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11124 1726882376.42207: in run() - task 0e448fcc-3ce9-8362-0f62-000000000029 11124 1726882376.42219: variable 'ansible_search_path' from source: unknown 11124 1726882376.42221: variable 'ansible_search_path' from source: unknown 11124 1726882376.42251: calling self._execute() 11124 1726882376.42320: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.42324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.42331: variable 'omit' from source: magic vars 11124 1726882376.42594: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.42603: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.42688: variable 'network_state' from source: role '' defaults 11124 1726882376.42695: Evaluated conditional (network_state != {}): False 11124 1726882376.42699: when evaluation is False, skipping this task 11124 1726882376.42702: _execute() done 11124 1726882376.42704: dumping result to json 11124 1726882376.42707: done dumping result, returning 11124 1726882376.42714: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-8362-0f62-000000000029] 11124 1726882376.42718: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000029 11124 1726882376.42804: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000029 11124 1726882376.42806: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882376.42884: no more pending results, returning what we have 11124 
1726882376.42888: results queue empty 11124 1726882376.42889: checking for any_errors_fatal 11124 1726882376.42893: done checking for any_errors_fatal 11124 1726882376.42894: checking for max_fail_percentage 11124 1726882376.42895: done checking for max_fail_percentage 11124 1726882376.42896: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.42897: done checking to see if all hosts have failed 11124 1726882376.42898: getting the remaining hosts for this loop 11124 1726882376.42899: done getting the remaining hosts for this loop 11124 1726882376.42902: getting the next task for host managed_node1 11124 1726882376.42907: done getting next task for host managed_node1 11124 1726882376.42910: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11124 1726882376.42913: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882376.42926: getting variables 11124 1726882376.42958: in VariableManager get_vars() 11124 1726882376.42988: Calling all_inventory to load vars for managed_node1 11124 1726882376.42990: Calling groups_inventory to load vars for managed_node1 11124 1726882376.42991: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.42998: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.43000: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.43001: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.44209: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.45660: done with get_vars() 11124 1726882376.45678: done getting variables 11124 1726882376.45720: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:32:56 -0400 (0:00:00.040) 0:00:16.699 ****** 11124 1726882376.45744: entering _queue_task() for managed_node1/fail 11124 1726882376.45959: worker is 1 (out of 1 available) 11124 1726882376.45974: exiting _queue_task() for managed_node1/fail 11124 1726882376.45987: done queuing things up, now waiting for results queue to drain 11124 1726882376.45989: waiting for pending results... 
11124 1726882376.46153: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11124 1726882376.46232: in run() - task 0e448fcc-3ce9-8362-0f62-00000000002a 11124 1726882376.46243: variable 'ansible_search_path' from source: unknown 11124 1726882376.46247: variable 'ansible_search_path' from source: unknown 11124 1726882376.46279: calling self._execute() 11124 1726882376.46348: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.46354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.46362: variable 'omit' from source: magic vars 11124 1726882376.46624: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.46634: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.46757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882376.48731: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882376.48780: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882376.48806: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882376.48830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882376.48850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882376.48911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.48930: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.48947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.48981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.48992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.49057: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.49077: Evaluated conditional (ansible_distribution_major_version | int > 9): False 11124 1726882376.49082: when evaluation is False, skipping this task 11124 1726882376.49085: _execute() done 11124 1726882376.49087: dumping result to json 11124 1726882376.49089: done dumping result, returning 11124 1726882376.49095: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-8362-0f62-00000000002a] 11124 1726882376.49102: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002a 11124 1726882376.49187: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002a 11124 1726882376.49190: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 11124 1726882376.49231: no more pending results, returning what we have 11124 1726882376.49234: 
results queue empty 11124 1726882376.49236: checking for any_errors_fatal 11124 1726882376.49240: done checking for any_errors_fatal 11124 1726882376.49241: checking for max_fail_percentage 11124 1726882376.49242: done checking for max_fail_percentage 11124 1726882376.49243: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.49244: done checking to see if all hosts have failed 11124 1726882376.49245: getting the remaining hosts for this loop 11124 1726882376.49246: done getting the remaining hosts for this loop 11124 1726882376.49252: getting the next task for host managed_node1 11124 1726882376.49258: done getting next task for host managed_node1 11124 1726882376.49262: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11124 1726882376.49267: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882376.49280: getting variables 11124 1726882376.49282: in VariableManager get_vars() 11124 1726882376.49320: Calling all_inventory to load vars for managed_node1 11124 1726882376.49322: Calling groups_inventory to load vars for managed_node1 11124 1726882376.49325: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.49333: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.49335: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.49337: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.50937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.52517: done with get_vars() 11124 1726882376.52534: done getting variables 11124 1726882376.52609: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:32:56 -0400 (0:00:00.068) 0:00:16.768 ****** 11124 1726882376.52631: entering _queue_task() for managed_node1/dnf 11124 1726882376.52848: worker is 1 (out of 1 available) 11124 1726882376.52860: exiting _queue_task() for managed_node1/dnf 11124 1726882376.52874: done queuing things up, now waiting for results queue to drain 11124 1726882376.52876: waiting for pending results... 
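Conditionals such as `(ansible_distribution_major_version | int > 9)` are Jinja2 expressions: the fact is a string, so the `int` filter coerces it before the comparison. A pure-Python approximation of that filter showing how the teaming check comes out `False` on a host whose major version is below 10 (the `"9"` fact value is an assumption consistent with, but not stated verbatim in, this log):

```python
def jinja_int(value, default=0):
    """Approximation of Jinja2's built-in `int` filter: coerce, else fall back."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return default


facts = {"ansible_distribution_major_version": "9"}  # facts store strings
conditional = jinja_int(facts["ansible_distribution_major_version"]) > 9
print("Evaluated conditional (ansible_distribution_major_version | int > 9):",
      conditional)
```

Without the `| int` filter the comparison would be between a string and an integer; the explicit coercion is what makes version checks like `!= '6'` (string compare) and `> 9` (numeric compare) both reliable in these role tasks.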
11124 1726882376.53035: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11124 1726882376.53120: in run() - task 0e448fcc-3ce9-8362-0f62-00000000002b 11124 1726882376.53135: variable 'ansible_search_path' from source: unknown 11124 1726882376.53138: variable 'ansible_search_path' from source: unknown 11124 1726882376.53171: calling self._execute() 11124 1726882376.53235: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.53239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.53247: variable 'omit' from source: magic vars 11124 1726882376.53501: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.53510: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.53641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882376.55183: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882376.55227: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882376.55255: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882376.55282: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882376.55303: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882376.55357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.55378: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.55398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.55426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.55437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.55516: variable 'ansible_distribution' from source: facts 11124 1726882376.55519: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.55531: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11124 1726882376.55602: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882376.55686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.55702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.55720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.55747: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.55758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.55787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.55802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.55818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.55846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.55857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.55885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.55901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 
1726882376.55917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.55947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.55956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.56058: variable 'network_connections' from source: task vars 11124 1726882376.56061: variable 'controller_profile' from source: play vars 11124 1726882376.56106: variable 'controller_profile' from source: play vars 11124 1726882376.56113: variable 'controller_device' from source: play vars 11124 1726882376.56155: variable 'controller_device' from source: play vars 11124 1726882376.56164: variable 'port1_profile' from source: play vars 11124 1726882376.56206: variable 'port1_profile' from source: play vars 11124 1726882376.56212: variable 'dhcp_interface1' from source: play vars 11124 1726882376.56254: variable 'dhcp_interface1' from source: play vars 11124 1726882376.56263: variable 'controller_profile' from source: play vars 11124 1726882376.56305: variable 'controller_profile' from source: play vars 11124 1726882376.56311: variable 'port2_profile' from source: play vars 11124 1726882376.56355: variable 'port2_profile' from source: play vars 11124 1726882376.56359: variable 'dhcp_interface2' from source: play vars 11124 1726882376.56402: variable 'dhcp_interface2' from source: play vars 11124 1726882376.56408: variable 'controller_profile' from source: play vars 11124 1726882376.56451: variable 'controller_profile' from source: play vars 11124 1726882376.56512: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882376.56618: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882376.56644: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882376.56670: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882376.56692: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882376.56724: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882376.56738: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882376.56756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.56775: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882376.56821: variable '__network_team_connections_defined' from source: role '' defaults 11124 1726882376.56969: variable 'network_connections' from source: task vars 11124 1726882376.56973: variable 'controller_profile' from source: play vars 11124 1726882376.57016: variable 'controller_profile' from source: play vars 11124 1726882376.57024: variable 'controller_device' from source: play vars 11124 1726882376.57066: variable 'controller_device' from source: play vars 11124 1726882376.57075: variable 
'port1_profile' from source: play vars 11124 1726882376.57118: variable 'port1_profile' from source: play vars 11124 1726882376.57121: variable 'dhcp_interface1' from source: play vars 11124 1726882376.57168: variable 'dhcp_interface1' from source: play vars 11124 1726882376.57173: variable 'controller_profile' from source: play vars 11124 1726882376.57214: variable 'controller_profile' from source: play vars 11124 1726882376.57219: variable 'port2_profile' from source: play vars 11124 1726882376.57266: variable 'port2_profile' from source: play vars 11124 1726882376.57271: variable 'dhcp_interface2' from source: play vars 11124 1726882376.57312: variable 'dhcp_interface2' from source: play vars 11124 1726882376.57317: variable 'controller_profile' from source: play vars 11124 1726882376.57361: variable 'controller_profile' from source: play vars 11124 1726882376.57386: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11124 1726882376.57390: when evaluation is False, skipping this task 11124 1726882376.57392: _execute() done 11124 1726882376.57395: dumping result to json 11124 1726882376.57397: done dumping result, returning 11124 1726882376.57404: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-00000000002b] 11124 1726882376.57409: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002b 11124 1726882376.57501: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002b 11124 1726882376.57504: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11124 1726882376.57548: no more pending results, returning what we have 11124 1726882376.57553: 
results queue empty 11124 1726882376.57554: checking for any_errors_fatal 11124 1726882376.57565: done checking for any_errors_fatal 11124 1726882376.57565: checking for max_fail_percentage 11124 1726882376.57567: done checking for max_fail_percentage 11124 1726882376.57571: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.57573: done checking to see if all hosts have failed 11124 1726882376.57573: getting the remaining hosts for this loop 11124 1726882376.57575: done getting the remaining hosts for this loop 11124 1726882376.57578: getting the next task for host managed_node1 11124 1726882376.57584: done getting next task for host managed_node1 11124 1726882376.57588: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11124 1726882376.57591: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882376.57604: getting variables 11124 1726882376.57606: in VariableManager get_vars() 11124 1726882376.57641: Calling all_inventory to load vars for managed_node1 11124 1726882376.57643: Calling groups_inventory to load vars for managed_node1 11124 1726882376.57646: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.57657: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.57659: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.57662: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.58475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.59408: done with get_vars() 11124 1726882376.59425: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11124 1726882376.59481: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:32:56 -0400 (0:00:00.068) 0:00:16.837 ****** 11124 1726882376.59501: entering _queue_task() for managed_node1/yum 11124 1726882376.59503: Creating lock for yum 11124 1726882376.59719: worker is 1 (out of 1 available) 11124 1726882376.59732: exiting _queue_task() for managed_node1/yum 11124 1726882376.59743: done queuing things up, now waiting for results queue to drain 11124 1726882376.59745: waiting for pending results... 
11124 1726882376.59901: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11124 1726882376.59980: in run() - task 0e448fcc-3ce9-8362-0f62-00000000002c 11124 1726882376.59991: variable 'ansible_search_path' from source: unknown 11124 1726882376.59994: variable 'ansible_search_path' from source: unknown 11124 1726882376.60022: calling self._execute() 11124 1726882376.60084: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.60088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.60095: variable 'omit' from source: magic vars 11124 1726882376.60342: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.60353: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.60467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882376.62158: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882376.62209: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882376.62237: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882376.62265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882376.62285: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882376.62338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.62362: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.62380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.62431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.62435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.62496: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.62507: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11124 1726882376.62510: when evaluation is False, skipping this task 11124 1726882376.62512: _execute() done 11124 1726882376.62515: dumping result to json 11124 1726882376.62519: done dumping result, returning 11124 1726882376.62525: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-00000000002c] 11124 1726882376.62530: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002c 11124 1726882376.62620: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002c 11124 1726882376.62623: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11124 1726882376.62673: no more pending results, returning 
what we have 11124 1726882376.62676: results queue empty 11124 1726882376.62677: checking for any_errors_fatal 11124 1726882376.62688: done checking for any_errors_fatal 11124 1726882376.62689: checking for max_fail_percentage 11124 1726882376.62691: done checking for max_fail_percentage 11124 1726882376.62691: checking to see if all hosts have failed and the running result is not ok 11124 1726882376.62692: done checking to see if all hosts have failed 11124 1726882376.62693: getting the remaining hosts for this loop 11124 1726882376.62694: done getting the remaining hosts for this loop 11124 1726882376.62698: getting the next task for host managed_node1 11124 1726882376.62703: done getting next task for host managed_node1 11124 1726882376.62707: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11124 1726882376.62710: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882376.62723: getting variables 11124 1726882376.62725: in VariableManager get_vars() 11124 1726882376.62762: Calling all_inventory to load vars for managed_node1 11124 1726882376.62766: Calling groups_inventory to load vars for managed_node1 11124 1726882376.62768: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882376.62776: Calling all_plugins_play to load vars for managed_node1 11124 1726882376.62778: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882376.62781: Calling groups_plugins_play to load vars for managed_node1 11124 1726882376.63691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882376.65268: done with get_vars() 11124 1726882376.65286: done getting variables 11124 1726882376.65328: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:32:56 -0400 (0:00:00.058) 0:00:16.896 ****** 11124 1726882376.65362: entering _queue_task() for managed_node1/fail 11124 1726882376.65586: worker is 1 (out of 1 available) 11124 1726882376.65600: exiting _queue_task() for managed_node1/fail 11124 1726882376.65613: done queuing things up, now waiting for results queue to drain 11124 1726882376.65615: waiting for pending results... 
11124 1726882376.65782: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11124 1726882376.65867: in run() - task 0e448fcc-3ce9-8362-0f62-00000000002d 11124 1726882376.65882: variable 'ansible_search_path' from source: unknown 11124 1726882376.65885: variable 'ansible_search_path' from source: unknown 11124 1726882376.65917: calling self._execute() 11124 1726882376.65978: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882376.65982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882376.65989: variable 'omit' from source: magic vars 11124 1726882376.66262: variable 'ansible_distribution_major_version' from source: facts 11124 1726882376.66274: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882376.66352: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882376.66487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882376.68294: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882376.68357: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882376.68392: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882376.68425: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882376.68448: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882376.68527: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11124 1726882376.68558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.68585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.68627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.68641: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.68692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.68714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.68738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.68782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.68794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.68832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882376.68858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882376.68885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.68920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882376.68933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882376.69069: variable 'network_connections' from source: task vars 11124 1726882376.69079: variable 'controller_profile' from source: play vars 11124 1726882376.69127: variable 'controller_profile' from source: play vars 11124 1726882376.69135: variable 'controller_device' from source: play vars 11124 1726882376.69188: variable 'controller_device' from source: play vars 11124 1726882376.69196: variable 'port1_profile' from source: play vars 11124 1726882376.69236: variable 'port1_profile' from source: play vars 11124 1726882376.69242: variable 'dhcp_interface1' from source: play vars 11124 1726882376.69290: variable 'dhcp_interface1' from source: play vars 11124 1726882376.69296: variable 'controller_profile' from source: play vars 11124 
1726882376.69336: variable 'controller_profile' from source: play vars 11124 1726882376.69342: variable 'port2_profile' from source: play vars 11124 1726882376.69389: variable 'port2_profile' from source: play vars 11124 1726882376.69395: variable 'dhcp_interface2' from source: play vars 11124 1726882376.69435: variable 'dhcp_interface2' from source: play vars 11124 1726882376.69441: variable 'controller_profile' from source: play vars 11124 1726882376.69488: variable 'controller_profile' from source: play vars 11124 1726882376.69535: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882376.69663: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882376.69694: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882376.69717: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882376.69737: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882376.69771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882376.69786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882376.69808: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882376.69825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False)
11124 1726882376.69882: variable '__network_team_connections_defined' from source: role '' defaults
11124 1726882376.70037: variable 'network_connections' from source: task vars
11124 1726882376.70041: variable 'controller_profile' from source: play vars
11124 1726882376.70087: variable 'controller_profile' from source: play vars
11124 1726882376.70093: variable 'controller_device' from source: play vars
11124 1726882376.70135: variable 'controller_device' from source: play vars
11124 1726882376.70142: variable 'port1_profile' from source: play vars
11124 1726882376.70187: variable 'port1_profile' from source: play vars
11124 1726882376.70192: variable 'dhcp_interface1' from source: play vars
11124 1726882376.70234: variable 'dhcp_interface1' from source: play vars
11124 1726882376.70240: variable 'controller_profile' from source: play vars
11124 1726882376.70284: variable 'controller_profile' from source: play vars
11124 1726882376.70290: variable 'port2_profile' from source: play vars
11124 1726882376.70330: variable 'port2_profile' from source: play vars
11124 1726882376.70337: variable 'dhcp_interface2' from source: play vars
11124 1726882376.70382: variable 'dhcp_interface2' from source: play vars
11124 1726882376.70388: variable 'controller_profile' from source: play vars
11124 1726882376.70428: variable 'controller_profile' from source: play vars
11124 1726882376.70454: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
11124 1726882376.70460: when evaluation is False, skipping this task
11124 1726882376.70465: _execute() done
11124 1726882376.70469: dumping result to json
11124 1726882376.70471: done dumping result, returning
11124 1726882376.70475: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-00000000002d]
11124 1726882376.70480: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002d
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
11124 1726882376.70615: no more pending results, returning what we have
11124 1726882376.70619: results queue empty
11124 1726882376.70620: checking for any_errors_fatal
11124 1726882376.70624: done checking for any_errors_fatal
11124 1726882376.70624: checking for max_fail_percentage
11124 1726882376.70626: done checking for max_fail_percentage
11124 1726882376.70627: checking to see if all hosts have failed and the running result is not ok
11124 1726882376.70628: done checking to see if all hosts have failed
11124 1726882376.70629: getting the remaining hosts for this loop
11124 1726882376.70630: done getting the remaining hosts for this loop
11124 1726882376.70633: getting the next task for host managed_node1
11124 1726882376.70639: done getting next task for host managed_node1
11124 1726882376.70643: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
11124 1726882376.70646: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882376.70659: getting variables
11124 1726882376.70661: in VariableManager get_vars()
11124 1726882376.70737: Calling all_inventory to load vars for managed_node1
11124 1726882376.70740: Calling groups_inventory to load vars for managed_node1
11124 1726882376.70742: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882376.70751: Calling all_plugins_play to load vars for managed_node1
11124 1726882376.70754: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882376.70757: Calling groups_plugins_play to load vars for managed_node1
11124 1726882376.71370: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002d
11124 1726882376.71374: WORKER PROCESS EXITING
11124 1726882376.72338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882376.74156: done with get_vars()
11124 1726882376.74182: done getting variables
11124 1726882376.74248: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024  21:32:56 -0400 (0:00:00.089)       0:00:16.985 ******
11124 1726882376.74283: entering _queue_task() for managed_node1/package
11124 1726882376.74609: worker is 1 (out of 1 available)
11124 1726882376.74621: exiting _queue_task() for managed_node1/package
11124 1726882376.74642: done queuing things up, now waiting for results queue to drain
11124 1726882376.74644: waiting for pending results...
11124 1726882376.74929: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
11124 1726882376.75051: in run() - task 0e448fcc-3ce9-8362-0f62-00000000002e
11124 1726882376.75062: variable 'ansible_search_path' from source: unknown
11124 1726882376.75070: variable 'ansible_search_path' from source: unknown
11124 1726882376.75114: calling self._execute()
11124 1726882376.75202: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882376.75208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882376.75216: variable 'omit' from source: magic vars
11124 1726882376.75593: variable 'ansible_distribution_major_version' from source: facts
11124 1726882376.75609: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882376.75812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11124 1726882376.76102: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11124 1726882376.76145: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11124 1726882376.76188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11124 1726882376.76220: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11124 1726882376.76335: variable 'network_packages' from source: role '' defaults
11124 1726882376.76453: variable '__network_provider_setup' from source: role '' defaults
11124 1726882376.76461: variable '__network_service_name_default_nm' from source: role '' defaults
11124 1726882376.76536: variable '__network_service_name_default_nm' from source: role '' defaults
11124 1726882376.76544: variable '__network_packages_default_nm' from source: role '' defaults
11124 1726882376.76618: variable '__network_packages_default_nm' from source: role '' defaults
11124 1726882376.76842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11124 1726882376.79007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11124 1726882376.79077: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11124 1726882376.79122: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11124 1726882376.79155: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11124 1726882376.79181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11124 1726882376.79265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882376.79293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882376.79329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882376.79369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882376.79384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882376.79436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882376.79459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882376.79485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882376.79533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882376.79547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882376.79802: variable '__network_packages_default_gobject_packages' from source: role '' defaults
11124 1726882376.79924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882376.79954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882376.79984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882376.80023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882376.80037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882376.80134: variable 'ansible_python' from source: facts
11124 1726882376.80160: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
11124 1726882376.80252: variable '__network_wpa_supplicant_required' from source: role '' defaults
11124 1726882376.80335: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
11124 1726882376.80468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882376.80495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882376.80526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882376.80566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882376.80583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882376.80642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882376.80668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882376.80693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882376.80742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882376.80756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882376.80915: variable 'network_connections' from source: task vars
11124 1726882376.80925: variable 'controller_profile' from source: play vars
11124 1726882376.81036: variable 'controller_profile' from source: play vars
11124 1726882376.81054: variable 'controller_device' from source: play vars
11124 1726882376.81161: variable 'controller_device' from source: play vars
11124 1726882376.81173: variable 'port1_profile' from source: play vars
11124 1726882376.81279: variable 'port1_profile' from source: play vars
11124 1726882376.81288: variable 'dhcp_interface1' from source: play vars
11124 1726882376.81394: variable 'dhcp_interface1' from source: play vars
11124 1726882376.81403: variable 'controller_profile' from source: play vars
11124 1726882376.81509: variable 'controller_profile' from source: play vars
11124 1726882376.81519: variable 'port2_profile' from source: play vars
11124 1726882376.81625: variable 'port2_profile' from source: play vars
11124 1726882376.81634: variable 'dhcp_interface2' from source: play vars
11124 1726882376.81741: variable 'dhcp_interface2' from source: play vars
11124 1726882376.81752: variable 'controller_profile' from source: play vars
11124 1726882376.81858: variable 'controller_profile' from source: play vars
11124 1726882376.81937: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11124 1726882376.81964: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11124 1726882376.81993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882376.82033: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11124 1726882376.82083: variable '__network_wireless_connections_defined' from source: role '' defaults
11124 1726882376.82394: variable 'network_connections' from source: task vars
11124 1726882376.82397: variable 'controller_profile' from source: play vars
11124 1726882376.82512: variable 'controller_profile' from source: play vars
11124 1726882376.82521: variable 'controller_device' from source: play vars
11124 1726882376.82632: variable 'controller_device' from source: play vars
11124 1726882376.82643: variable 'port1_profile' from source: play vars
11124 1726882376.82755: variable 'port1_profile' from source: play vars
11124 1726882376.82761: variable 'dhcp_interface1' from source: play vars
11124 1726882376.82875: variable 'dhcp_interface1' from source: play vars
11124 1726882376.82888: variable 'controller_profile' from source: play vars
11124 1726882376.82997: variable 'controller_profile' from source: play vars
11124 1726882376.83010: variable 'port2_profile' from source: play vars
11124 1726882376.83116: variable 'port2_profile' from source: play vars
11124 1726882376.83126: variable 'dhcp_interface2' from source: play vars
11124 1726882376.83231: variable 'dhcp_interface2' from source: play vars
11124 1726882376.83240: variable 'controller_profile' from source: play vars
11124 1726882376.83347: variable 'controller_profile' from source: play vars
11124 1726882376.83400: variable '__network_packages_default_wireless' from source: role '' defaults
11124 1726882376.83489: variable '__network_wireless_connections_defined' from source: role '' defaults
11124 1726882376.83835: variable 'network_connections' from source: task vars
11124 1726882376.83838: variable 'controller_profile' from source: play vars
11124 1726882376.83912: variable 'controller_profile' from source: play vars
11124 1726882376.83919: variable 'controller_device' from source: play vars
11124 1726882376.83991: variable 'controller_device' from source: play vars
11124 1726882376.84000: variable 'port1_profile' from source: play vars
11124 1726882376.84061: variable 'port1_profile' from source: play vars
11124 1726882376.84072: variable 'dhcp_interface1' from source: play vars
11124 1726882376.84141: variable 'dhcp_interface1' from source: play vars
11124 1726882376.84146: variable 'controller_profile' from source: play vars
11124 1726882376.84220: variable 'controller_profile' from source: play vars
11124 1726882376.84226: variable 'port2_profile' from source: play vars
11124 1726882376.84300: variable 'port2_profile' from source: play vars
11124 1726882376.84313: variable 'dhcp_interface2' from source: play vars
11124 1726882376.84377: variable 'dhcp_interface2' from source: play vars
11124 1726882376.84383: variable 'controller_profile' from source: play vars
11124 1726882376.84459: variable 'controller_profile' from source: play vars
11124 1726882376.84486: variable '__network_packages_default_team' from source: role '' defaults
11124 1726882376.84579: variable '__network_team_connections_defined' from source: role '' defaults
11124 1726882376.84904: variable 'network_connections' from source: task vars
11124 1726882376.84908: variable 'controller_profile' from source: play vars
11124 1726882376.84982: variable 'controller_profile' from source: play vars
11124 1726882376.84988: variable 'controller_device' from source: play vars
11124 1726882376.85054: variable 'controller_device' from source: play vars
11124 1726882376.85067: variable 'port1_profile' from source: play vars
11124 1726882376.85128: variable 'port1_profile' from source: play vars
11124 1726882376.85134: variable 'dhcp_interface1' from source: play vars
11124 1726882376.85206: variable 'dhcp_interface1' from source: play vars
11124 1726882376.85211: variable 'controller_profile' from source: play vars
11124 1726882376.85280: variable 'controller_profile' from source: play vars
11124 1726882376.85287: variable 'port2_profile' from source: play vars
11124 1726882376.85345: variable 'port2_profile' from source: play vars
11124 1726882376.85353: variable 'dhcp_interface2' from source: play vars
11124 1726882376.85422: variable 'dhcp_interface2' from source: play vars
11124 1726882376.85428: variable 'controller_profile' from source: play vars
11124 1726882376.85499: variable 'controller_profile' from source: play vars
11124 1726882376.85557: variable '__network_service_name_default_initscripts' from source: role '' defaults
11124 1726882376.85626: variable '__network_service_name_default_initscripts' from source: role '' defaults
11124 1726882376.85632: variable '__network_packages_default_initscripts' from source: role '' defaults
11124 1726882376.85692: variable '__network_packages_default_initscripts' from source: role '' defaults
11124 1726882376.85918: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
11124 1726882376.86416: variable 'network_connections' from source: task vars
11124 1726882376.86419: variable 'controller_profile' from source: play vars
11124 1726882376.86490: variable 'controller_profile' from source: play vars
11124 1726882376.86497: variable 'controller_device' from source: play vars
11124 1726882376.86555: variable 'controller_device' from source: play vars
11124 1726882376.86571: variable 'port1_profile' from source: play vars
11124 1726882376.86631: variable 'port1_profile' from source: play vars
11124 1726882376.86637: variable 'dhcp_interface1' from source: play vars
11124 1726882376.86709: variable 'dhcp_interface1' from source: play vars
11124 1726882376.86715: variable 'controller_profile' from source: play vars
11124 1726882376.86774: variable 'controller_profile' from source: play vars
11124 1726882376.86781: variable 'port2_profile' from source: play vars
11124 1726882376.86846: variable 'port2_profile' from source: play vars
11124 1726882376.86854: variable 'dhcp_interface2' from source: play vars
11124 1726882376.86923: variable 'dhcp_interface2' from source: play vars
11124 1726882376.86928: variable 'controller_profile' from source: play vars
11124 1726882376.86993: variable 'controller_profile' from source: play vars
11124 1726882376.87006: variable 'ansible_distribution' from source: facts
11124 1726882376.87013: variable '__network_rh_distros' from source: role '' defaults
11124 1726882376.87020: variable 'ansible_distribution_major_version' from source: facts
11124 1726882376.87045: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
11124 1726882376.87225: variable 'ansible_distribution' from source: facts
11124 1726882376.87234: variable '__network_rh_distros' from source: role '' defaults
11124 1726882376.87240: variable 'ansible_distribution_major_version' from source: facts
11124 1726882376.87254: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
11124 1726882376.87422: variable 'ansible_distribution' from source: facts
11124 1726882376.87430: variable '__network_rh_distros' from source: role '' defaults
11124 1726882376.87440: variable 'ansible_distribution_major_version' from source: facts
11124 1726882376.87481: variable 'network_provider' from source: set_fact
11124 1726882376.87495: variable 'ansible_facts' from source: unknown
11124 1726882376.88084: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
11124 1726882376.88087: when evaluation is False, skipping this task
11124 1726882376.88091: _execute() done
11124 1726882376.88093: dumping result to json
11124 1726882376.88095: done dumping result, returning
11124 1726882376.88102: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-8362-0f62-00000000002e]
11124 1726882376.88108: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002e
11124 1726882376.88195: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002e
11124 1726882376.88198: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
11124 1726882376.88252: no more pending results, returning what we have
11124 1726882376.88256: results queue empty
11124 1726882376.88257: checking for any_errors_fatal
11124 1726882376.88265: done checking for any_errors_fatal
11124 1726882376.88266: checking for max_fail_percentage
11124 1726882376.88267: done checking for max_fail_percentage
11124 1726882376.88268: checking to see if all hosts have failed and the running result is not ok
11124 1726882376.88269: done checking to see if all hosts have failed
11124 1726882376.88270: getting the remaining hosts for this loop
11124 1726882376.88271: done getting the remaining hosts for this loop
11124 1726882376.88275: getting the next task for host managed_node1
11124 1726882376.88281: done getting next task for host managed_node1
11124 1726882376.88285: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11124 1726882376.88289: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882376.88304: getting variables
11124 1726882376.88305: in VariableManager get_vars()
11124 1726882376.88346: Calling all_inventory to load vars for managed_node1
11124 1726882376.88351: Calling groups_inventory to load vars for managed_node1
11124 1726882376.88353: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882376.88362: Calling all_plugins_play to load vars for managed_node1
11124 1726882376.88372: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882376.88376: Calling groups_plugins_play to load vars for managed_node1
11124 1726882376.89179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882376.90271: done with get_vars()
11124 1726882376.90296: done getting variables
11124 1726882376.90360: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:32:56 -0400 (0:00:00.161)       0:00:17.146 ******
11124 1726882376.90396: entering _queue_task() for managed_node1/package
11124 1726882376.90691: worker is 1 (out of 1 available)
11124 1726882376.90705: exiting _queue_task() for managed_node1/package
11124 1726882376.90717: done queuing things up, now waiting for results queue to drain
11124 1726882376.90719: waiting for pending results...
11124 1726882376.90991: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11124 1726882376.91089: in run() - task 0e448fcc-3ce9-8362-0f62-00000000002f
11124 1726882376.91101: variable 'ansible_search_path' from source: unknown
11124 1726882376.91105: variable 'ansible_search_path' from source: unknown
11124 1726882376.91145: calling self._execute()
11124 1726882376.91208: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882376.91212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882376.91219: variable 'omit' from source: magic vars
11124 1726882376.91601: variable 'ansible_distribution_major_version' from source: facts
11124 1726882376.91619: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882376.91749: variable 'network_state' from source: role '' defaults
11124 1726882376.91765: Evaluated conditional (network_state != {}): False
11124 1726882376.91773: when evaluation is False, skipping this task
11124 1726882376.91780: _execute() done
11124 1726882376.91786: dumping result to json
11124 1726882376.91793: done dumping result, returning
11124 1726882376.91813: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-8362-0f62-00000000002f]
11124 1726882376.91824: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002f
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11124 1726882376.91971: no more pending results, returning what we have
11124 1726882376.91975: results queue empty
11124 1726882376.91976: checking for any_errors_fatal
11124 1726882376.91981: done checking for any_errors_fatal
11124 1726882376.91982: checking for max_fail_percentage
11124 1726882376.91983: done checking for max_fail_percentage
11124 1726882376.91984: checking to see if all hosts have failed and the running result is not ok
11124 1726882376.91985: done checking to see if all hosts have failed
11124 1726882376.91986: getting the remaining hosts for this loop
11124 1726882376.91987: done getting the remaining hosts for this loop
11124 1726882376.91991: getting the next task for host managed_node1
11124 1726882376.91998: done getting next task for host managed_node1
11124 1726882376.92002: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11124 1726882376.92006: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882376.92022: getting variables
11124 1726882376.92024: in VariableManager get_vars()
11124 1726882376.92076: Calling all_inventory to load vars for managed_node1
11124 1726882376.92079: Calling groups_inventory to load vars for managed_node1
11124 1726882376.92082: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882376.92096: Calling all_plugins_play to load vars for managed_node1
11124 1726882376.92099: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882376.92103: Calling groups_plugins_play to load vars for managed_node1
11124 1726882376.95430: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000002f
11124 1726882376.95434: WORKER PROCESS EXITING
11124 1726882376.95801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882376.96704: done with get_vars()
11124 1726882376.96718: done getting variables
11124 1726882376.96753: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:32:56 -0400 (0:00:00.063)       0:00:17.210 ******
11124 1726882376.96775: entering _queue_task() for managed_node1/package
11124 1726882376.96990: worker is 1 (out of 1 available)
11124 1726882376.97003: exiting _queue_task() for managed_node1/package
11124 1726882376.97016: done queuing things up, now waiting for results queue to drain
11124 1726882376.97018: waiting for pending results...
11124 1726882376.97193: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11124 1726882376.97269: in run() - task 0e448fcc-3ce9-8362-0f62-000000000030
11124 1726882376.97283: variable 'ansible_search_path' from source: unknown
11124 1726882376.97286: variable 'ansible_search_path' from source: unknown
11124 1726882376.97313: calling self._execute()
11124 1726882376.97387: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882376.97391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882376.97395: variable 'omit' from source: magic vars
11124 1726882376.97656: variable 'ansible_distribution_major_version' from source: facts
11124 1726882376.97666: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882376.97746: variable 'network_state' from source: role '' defaults
11124 1726882376.97754: Evaluated conditional (network_state != {}): False
11124 1726882376.97759: when evaluation is False, skipping this task
11124 1726882376.97761: _execute() done
11124 1726882376.97766: dumping result to json
11124 1726882376.97769: done dumping result, returning
11124 1726882376.97775: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-8362-0f62-000000000030]
11124 1726882376.97780: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000030
11124 1726882376.97877: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000030
11124 1726882376.97880: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11124 1726882376.97932: no more pending results, returning what we have
11124 1726882376.97935: results queue empty
11124 1726882376.97936: checking for any_errors_fatal
11124 1726882376.97943: done checking for any_errors_fatal
11124 1726882376.97943: checking for max_fail_percentage
11124 1726882376.97945: done checking for max_fail_percentage
11124 1726882376.97945: checking to see if all hosts have failed and the running result is not ok
11124 1726882376.97946: done checking to see if all hosts have failed
11124 1726882376.97947: getting the remaining hosts for this loop
11124 1726882376.97951: done getting the remaining hosts for this loop
11124 1726882376.97954: getting the next task for host managed_node1
11124 1726882376.97959: done getting next task for host managed_node1
11124 1726882376.97963: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
11124 1726882376.97967: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11124 1726882376.97986: getting variables
11124 1726882376.97987: in VariableManager get_vars()
11124 1726882376.98021: Calling all_inventory to load vars for managed_node1
11124 1726882376.98023: Calling groups_inventory to load vars for managed_node1
11124 1726882376.98025: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882376.98033: Calling all_plugins_play to load vars for managed_node1
11124 1726882376.98034: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882376.98036: Calling groups_plugins_play to load vars for managed_node1
11124 1726882376.98794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882376.99824: done with get_vars()
11124 1726882376.99837: done getting variables
11124 1726882376.99907: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:32:56 -0400 (0:00:00.031)       0:00:17.241 ******
11124 1726882376.99929: entering _queue_task() for managed_node1/service
11124 1726882376.99930: Creating lock for service
11124 1726882377.00122: worker is 1 (out of 1 available)
11124 1726882377.00136: exiting _queue_task() for managed_node1/service
11124 1726882377.00148: done queuing things up, now waiting for results queue to drain
11124 1726882377.00152: waiting for pending results...
11124 1726882377.00306: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11124 1726882377.00388: in run() - task 0e448fcc-3ce9-8362-0f62-000000000031 11124 1726882377.00399: variable 'ansible_search_path' from source: unknown 11124 1726882377.00403: variable 'ansible_search_path' from source: unknown 11124 1726882377.00429: calling self._execute() 11124 1726882377.00496: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882377.00500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882377.00508: variable 'omit' from source: magic vars 11124 1726882377.00755: variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.00762: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882377.00839: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882377.00969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882377.02475: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882377.02524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882377.02555: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882377.02580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882377.02601: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882377.02657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11124 1726882377.02680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.02698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.02723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.02734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.02767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.02786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.02803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.02827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.02838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.02867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.02886: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.02903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.02927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.02938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.03048: variable 'network_connections' from source: task vars 11124 1726882377.03057: variable 'controller_profile' from source: play vars 11124 1726882377.03108: variable 'controller_profile' from source: play vars 11124 1726882377.03115: variable 'controller_device' from source: play vars 11124 1726882377.03157: variable 'controller_device' from source: play vars 11124 1726882377.03166: variable 'port1_profile' from source: play vars 11124 1726882377.03212: variable 'port1_profile' from source: play vars 11124 1726882377.03217: variable 'dhcp_interface1' from source: play vars 11124 1726882377.03262: variable 'dhcp_interface1' from source: play vars 11124 1726882377.03268: variable 'controller_profile' from source: play vars 11124 
1726882377.03311: variable 'controller_profile' from source: play vars 11124 1726882377.03319: variable 'port2_profile' from source: play vars 11124 1726882377.03361: variable 'port2_profile' from source: play vars 11124 1726882377.03368: variable 'dhcp_interface2' from source: play vars 11124 1726882377.03413: variable 'dhcp_interface2' from source: play vars 11124 1726882377.03416: variable 'controller_profile' from source: play vars 11124 1726882377.03461: variable 'controller_profile' from source: play vars 11124 1726882377.03506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882377.03623: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882377.03653: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882377.03674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882377.03696: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882377.03729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882377.03746: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882377.03763: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.03790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11124 1726882377.03841: variable '__network_team_connections_defined' from source: role '' defaults 11124 1726882377.03991: variable 'network_connections' from source: task vars 11124 1726882377.03994: variable 'controller_profile' from source: play vars 11124 1726882377.04036: variable 'controller_profile' from source: play vars 11124 1726882377.04041: variable 'controller_device' from source: play vars 11124 1726882377.04087: variable 'controller_device' from source: play vars 11124 1726882377.04094: variable 'port1_profile' from source: play vars 11124 1726882377.04135: variable 'port1_profile' from source: play vars 11124 1726882377.04141: variable 'dhcp_interface1' from source: play vars 11124 1726882377.04186: variable 'dhcp_interface1' from source: play vars 11124 1726882377.04191: variable 'controller_profile' from source: play vars 11124 1726882377.04233: variable 'controller_profile' from source: play vars 11124 1726882377.04238: variable 'port2_profile' from source: play vars 11124 1726882377.04285: variable 'port2_profile' from source: play vars 11124 1726882377.04288: variable 'dhcp_interface2' from source: play vars 11124 1726882377.04330: variable 'dhcp_interface2' from source: play vars 11124 1726882377.04335: variable 'controller_profile' from source: play vars 11124 1726882377.04381: variable 'controller_profile' from source: play vars 11124 1726882377.04404: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11124 1726882377.04407: when evaluation is False, skipping this task 11124 1726882377.04411: _execute() done 11124 1726882377.04413: dumping result to json 11124 1726882377.04415: done dumping result, returning 11124 1726882377.04422: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-000000000031] 11124 1726882377.04427: sending task result for task 
0e448fcc-3ce9-8362-0f62-000000000031 11124 1726882377.04516: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000031 11124 1726882377.04519: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11124 1726882377.04567: no more pending results, returning what we have 11124 1726882377.04570: results queue empty 11124 1726882377.04571: checking for any_errors_fatal 11124 1726882377.04579: done checking for any_errors_fatal 11124 1726882377.04579: checking for max_fail_percentage 11124 1726882377.04581: done checking for max_fail_percentage 11124 1726882377.04581: checking to see if all hosts have failed and the running result is not ok 11124 1726882377.04582: done checking to see if all hosts have failed 11124 1726882377.04583: getting the remaining hosts for this loop 11124 1726882377.04584: done getting the remaining hosts for this loop 11124 1726882377.04588: getting the next task for host managed_node1 11124 1726882377.04597: done getting next task for host managed_node1 11124 1726882377.04602: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11124 1726882377.04608: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882377.04621: getting variables 11124 1726882377.04623: in VariableManager get_vars() 11124 1726882377.04662: Calling all_inventory to load vars for managed_node1 11124 1726882377.04667: Calling groups_inventory to load vars for managed_node1 11124 1726882377.04670: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882377.04678: Calling all_plugins_play to load vars for managed_node1 11124 1726882377.04680: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882377.04683: Calling groups_plugins_play to load vars for managed_node1 11124 1726882377.05502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882377.06445: done with get_vars() 11124 1726882377.06464: done getting variables 11124 1726882377.06505: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:32:57 -0400 (0:00:00.065) 0:00:17.307 ****** 11124 1726882377.06526: entering _queue_task() for managed_node1/service 11124 1726882377.06739: worker is 1 (out of 1 available) 11124 1726882377.06755: exiting _queue_task() for managed_node1/service 11124 1726882377.06770: done queuing things up, now waiting for results queue to drain 11124 1726882377.06772: waiting for pending results... 
11124 1726882377.06937: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11124 1726882377.07016: in run() - task 0e448fcc-3ce9-8362-0f62-000000000032 11124 1726882377.07027: variable 'ansible_search_path' from source: unknown 11124 1726882377.07031: variable 'ansible_search_path' from source: unknown 11124 1726882377.07061: calling self._execute() 11124 1726882377.07129: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882377.07133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882377.07141: variable 'omit' from source: magic vars 11124 1726882377.07394: variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.07404: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882377.07511: variable 'network_provider' from source: set_fact 11124 1726882377.07514: variable 'network_state' from source: role '' defaults 11124 1726882377.07522: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11124 1726882377.07533: variable 'omit' from source: magic vars 11124 1726882377.07570: variable 'omit' from source: magic vars 11124 1726882377.07589: variable 'network_service_name' from source: role '' defaults 11124 1726882377.07637: variable 'network_service_name' from source: role '' defaults 11124 1726882377.07710: variable '__network_provider_setup' from source: role '' defaults 11124 1726882377.07713: variable '__network_service_name_default_nm' from source: role '' defaults 11124 1726882377.07761: variable '__network_service_name_default_nm' from source: role '' defaults 11124 1726882377.07769: variable '__network_packages_default_nm' from source: role '' defaults 11124 1726882377.07812: variable '__network_packages_default_nm' from source: role '' defaults 11124 1726882377.07959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11124 1726882377.09669: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882377.09719: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882377.09746: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882377.09773: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882377.09794: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882377.09852: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.09895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.09932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.09981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.10000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.10056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11124 1726882377.10088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.10118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.10176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.10197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.10454: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11124 1726882377.10586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.10619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.10652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.10710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.10734: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.10840: variable 'ansible_python' from source: facts 11124 1726882377.10875: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11124 1726882377.10939: variable '__network_wpa_supplicant_required' from source: role '' defaults 11124 1726882377.11009: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11124 1726882377.11093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.11110: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.11134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.11162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.11174: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.11205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.11228: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.11242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.11273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.11284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.11376: variable 'network_connections' from source: task vars 11124 1726882377.11381: variable 'controller_profile' from source: play vars 11124 1726882377.11431: variable 'controller_profile' from source: play vars 11124 1726882377.11445: variable 'controller_device' from source: play vars 11124 1726882377.11501: variable 'controller_device' from source: play vars 11124 1726882377.11511: variable 'port1_profile' from source: play vars 11124 1726882377.11567: variable 'port1_profile' from source: play vars 11124 1726882377.11576: variable 'dhcp_interface1' from source: play vars 11124 1726882377.11626: variable 'dhcp_interface1' from source: play vars 11124 1726882377.11634: variable 'controller_profile' from source: play vars 11124 1726882377.11691: variable 'controller_profile' from source: play vars 11124 1726882377.11699: variable 'port2_profile' from source: play vars 11124 1726882377.11748: variable 'port2_profile' from source: play vars 11124 1726882377.11759: variable 'dhcp_interface2' from source: play vars 11124 1726882377.11811: variable 'dhcp_interface2' from source: play vars 11124 
1726882377.11820: variable 'controller_profile' from source: play vars 11124 1726882377.11873: variable 'controller_profile' from source: play vars 11124 1726882377.11943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882377.12079: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882377.12115: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882377.12146: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882377.12178: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882377.12223: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882377.12244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882377.12271: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.12294: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882377.12332: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882377.12509: variable 'network_connections' from source: task vars 11124 1726882377.12515: variable 'controller_profile' from source: play vars 11124 1726882377.12572: variable 'controller_profile' from source: play vars 11124 
1726882377.12580: variable 'controller_device' from source: play vars 11124 1726882377.12629: variable 'controller_device' from source: play vars 11124 1726882377.12644: variable 'port1_profile' from source: play vars 11124 1726882377.12695: variable 'port1_profile' from source: play vars 11124 1726882377.12703: variable 'dhcp_interface1' from source: play vars 11124 1726882377.12782: variable 'dhcp_interface1' from source: play vars 11124 1726882377.12785: variable 'controller_profile' from source: play vars 11124 1726882377.12850: variable 'controller_profile' from source: play vars 11124 1726882377.12862: variable 'port2_profile' from source: play vars 11124 1726882377.13482: variable 'port2_profile' from source: play vars 11124 1726882377.13485: variable 'dhcp_interface2' from source: play vars 11124 1726882377.13488: variable 'dhcp_interface2' from source: play vars 11124 1726882377.13490: variable 'controller_profile' from source: play vars 11124 1726882377.13492: variable 'controller_profile' from source: play vars 11124 1726882377.13494: variable '__network_packages_default_wireless' from source: role '' defaults 11124 1726882377.13496: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882377.13498: variable 'network_connections' from source: task vars 11124 1726882377.13500: variable 'controller_profile' from source: play vars 11124 1726882377.13569: variable 'controller_profile' from source: play vars 11124 1726882377.13572: variable 'controller_device' from source: play vars 11124 1726882377.13662: variable 'controller_device' from source: play vars 11124 1726882377.13667: variable 'port1_profile' from source: play vars 11124 1726882377.13699: variable 'port1_profile' from source: play vars 11124 1726882377.13812: variable 'dhcp_interface1' from source: play vars 11124 1726882377.13815: variable 'dhcp_interface1' from source: play vars 11124 1726882377.13818: variable 'controller_profile' from source: play vars 
11124 1726882377.13921: variable 'controller_profile' from source: play vars 11124 1726882377.13924: variable 'port2_profile' from source: play vars 11124 1726882377.13927: variable 'port2_profile' from source: play vars 11124 1726882377.13929: variable 'dhcp_interface2' from source: play vars 11124 1726882377.14038: variable 'dhcp_interface2' from source: play vars 11124 1726882377.14042: variable 'controller_profile' from source: play vars 11124 1726882377.14147: variable 'controller_profile' from source: play vars 11124 1726882377.14150: variable '__network_packages_default_team' from source: role '' defaults 11124 1726882377.14152: variable '__network_team_connections_defined' from source: role '' defaults 11124 1726882377.14430: variable 'network_connections' from source: task vars 11124 1726882377.14434: variable 'controller_profile' from source: play vars 11124 1726882377.14506: variable 'controller_profile' from source: play vars 11124 1726882377.14513: variable 'controller_device' from source: play vars 11124 1726882377.14582: variable 'controller_device' from source: play vars 11124 1726882377.14591: variable 'port1_profile' from source: play vars 11124 1726882377.14660: variable 'port1_profile' from source: play vars 11124 1726882377.14668: variable 'dhcp_interface1' from source: play vars 11124 1726882377.14735: variable 'dhcp_interface1' from source: play vars 11124 1726882377.14741: variable 'controller_profile' from source: play vars 11124 1726882377.14812: variable 'controller_profile' from source: play vars 11124 1726882377.14818: variable 'port2_profile' from source: play vars 11124 1726882377.14889: variable 'port2_profile' from source: play vars 11124 1726882377.14896: variable 'dhcp_interface2' from source: play vars 11124 1726882377.14966: variable 'dhcp_interface2' from source: play vars 11124 1726882377.14972: variable 'controller_profile' from source: play vars 11124 1726882377.15060: variable 'controller_profile' from source: play vars 
11124 1726882377.15125: variable '__network_service_name_default_initscripts' from source: role '' defaults 11124 1726882377.15183: variable '__network_service_name_default_initscripts' from source: role '' defaults 11124 1726882377.15189: variable '__network_packages_default_initscripts' from source: role '' defaults 11124 1726882377.15245: variable '__network_packages_default_initscripts' from source: role '' defaults 11124 1726882377.15488: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11124 1726882377.15933: variable 'network_connections' from source: task vars 11124 1726882377.15938: variable 'controller_profile' from source: play vars 11124 1726882377.16000: variable 'controller_profile' from source: play vars 11124 1726882377.16005: variable 'controller_device' from source: play vars 11124 1726882377.16066: variable 'controller_device' from source: play vars 11124 1726882377.16074: variable 'port1_profile' from source: play vars 11124 1726882377.16130: variable 'port1_profile' from source: play vars 11124 1726882377.16137: variable 'dhcp_interface1' from source: play vars 11124 1726882377.16197: variable 'dhcp_interface1' from source: play vars 11124 1726882377.16209: variable 'controller_profile' from source: play vars 11124 1726882377.16276: variable 'controller_profile' from source: play vars 11124 1726882377.16287: variable 'port2_profile' from source: play vars 11124 1726882377.16345: variable 'port2_profile' from source: play vars 11124 1726882377.16362: variable 'dhcp_interface2' from source: play vars 11124 1726882377.16415: variable 'dhcp_interface2' from source: play vars 11124 1726882377.16419: variable 'controller_profile' from source: play vars 11124 1726882377.16475: variable 'controller_profile' from source: play vars 11124 1726882377.16490: variable 'ansible_distribution' from source: facts 11124 1726882377.16493: variable '__network_rh_distros' from source: role '' defaults 11124 1726882377.16497: 
variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.16517: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11124 1726882377.16635: variable 'ansible_distribution' from source: facts 11124 1726882377.16648: variable '__network_rh_distros' from source: role '' defaults 11124 1726882377.16651: variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.16664: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11124 1726882377.16776: variable 'ansible_distribution' from source: facts 11124 1726882377.16780: variable '__network_rh_distros' from source: role '' defaults 11124 1726882377.16784: variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.16812: variable 'network_provider' from source: set_fact 11124 1726882377.16828: variable 'omit' from source: magic vars 11124 1726882377.16850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882377.16876: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882377.16892: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882377.16908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882377.16914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882377.16935: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882377.16938: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882377.16941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882377.17016: Set connection var ansible_shell_executable to /bin/sh 11124 
1726882377.17020: Set connection var ansible_shell_type to sh 11124 1726882377.17027: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882377.17032: Set connection var ansible_timeout to 10 11124 1726882377.17036: Set connection var ansible_pipelining to False 11124 1726882377.17039: Set connection var ansible_connection to ssh 11124 1726882377.17060: variable 'ansible_shell_executable' from source: unknown 11124 1726882377.17064: variable 'ansible_connection' from source: unknown 11124 1726882377.17067: variable 'ansible_module_compression' from source: unknown 11124 1726882377.17069: variable 'ansible_shell_type' from source: unknown 11124 1726882377.17074: variable 'ansible_shell_executable' from source: unknown 11124 1726882377.17076: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882377.17080: variable 'ansible_pipelining' from source: unknown 11124 1726882377.17082: variable 'ansible_timeout' from source: unknown 11124 1726882377.17086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882377.17158: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882377.17169: variable 'omit' from source: magic vars 11124 1726882377.17174: starting attempt loop 11124 1726882377.17176: running the handler 11124 1726882377.17233: variable 'ansible_facts' from source: unknown 11124 1726882377.17665: _low_level_execute_command(): starting 11124 1726882377.17669: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882377.18202: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882377.18211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11124 1726882377.18222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882377.18235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.18280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882377.18284: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882377.18404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.18409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882377.18411: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882377.18414: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882377.18416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882377.18418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882377.18420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.18422: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882377.18424: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882377.18426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.18428: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882377.18447: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882377.18449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882377.18589: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 
1726882377.20250: stdout chunk (state=3): >>>/root <<< 11124 1726882377.20346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882377.20409: stderr chunk (state=3): >>><<< 11124 1726882377.20417: stdout chunk (state=3): >>><<< 11124 1726882377.20436: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882377.20448: _low_level_execute_command(): starting 11124 1726882377.20454: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939 `" && echo ansible-tmp-1726882377.2043676-11982-35942989136939="` echo /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939 `" ) && sleep 0' 11124 1726882377.20937: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882377.20956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.20988: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.21006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882377.21009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.21079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882377.21095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882377.21109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882377.21244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882377.23114: stdout chunk (state=3): >>>ansible-tmp-1726882377.2043676-11982-35942989136939=/root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939 <<< 11124 1726882377.23229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882377.23326: stderr chunk (state=3): >>><<< 11124 1726882377.23338: stdout chunk (state=3): >>><<< 11124 1726882377.23437: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882377.2043676-11982-35942989136939=/root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882377.23441: variable 'ansible_module_compression' from source: unknown 11124 1726882377.23488: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11124 1726882377.23496: ANSIBALLZ: Acquiring lock 11124 1726882377.23503: ANSIBALLZ: Lock acquired: 139628947188928 11124 1726882377.23510: ANSIBALLZ: Creating module 11124 1726882377.49378: ANSIBALLZ: Writing module into payload 11124 1726882377.49516: ANSIBALLZ: Writing module 11124 1726882377.49543: ANSIBALLZ: Renaming module 11124 1726882377.49547: ANSIBALLZ: Done creating module 11124 1726882377.49567: variable 'ansible_facts' from source: unknown 11124 1726882377.49686: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/AnsiballZ_systemd.py 11124 1726882377.49798: Sending initial data 11124 1726882377.49802: Sent initial data (155 bytes) 11124 1726882377.50505: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882377.50511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.50540: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.50552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.50604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882377.50616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882377.50626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882377.50731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882377.52586: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882377.52700: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882377.52800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpgiacdut3 /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/AnsiballZ_systemd.py <<< 11124 1726882377.52892: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882377.55615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882377.55775: stderr chunk (state=3): >>><<< 11124 1726882377.55778: stdout chunk (state=3): >>><<< 11124 1726882377.55780: done transferring module to remote 11124 1726882377.55783: _low_level_execute_command(): starting 11124 1726882377.55785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/ /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/AnsiballZ_systemd.py && sleep 0' 11124 1726882377.56189: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882377.56195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.56225: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882377.56232: stderr chunk (state=3): >>>debug2: match not found 
<<< 11124 1726882377.56241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.56253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882377.56256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882377.56267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.56276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882377.56281: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.56331: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882377.56358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882377.56361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882377.56454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882377.58218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882377.58270: stderr chunk (state=3): >>><<< 11124 1726882377.58273: stdout chunk (state=3): >>><<< 11124 1726882377.58287: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882377.58290: _low_level_execute_command(): starting 11124 1726882377.58295: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/AnsiballZ_systemd.py && sleep 0' 11124 1726882377.58716: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882377.58721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.58751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.58766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882377.58777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.58826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882377.58834: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882377.58947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882377.84030: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "15618048", "MemoryAvailable": "infinity", "CPUUsageNSec": "419636000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "<<< 11124 1726882377.84073: stdout chunk (state=3): >>>0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", 
"PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": 
"27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11124 1726882377.85658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882377.85715: stderr chunk (state=3): >>><<< 11124 1726882377.85721: stdout chunk (state=3): >>><<< 11124 1726882377.85740: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "15618048", "MemoryAvailable": "infinity", "CPUUsageNSec": "419636000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", 
"LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", 
"OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882377.85857: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882377.85871: _low_level_execute_command(): starting 11124 1726882377.85876: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882377.2043676-11982-35942989136939/ > /dev/null 2>&1 && sleep 0' 11124 1726882377.86329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882377.86334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882377.86371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.86383: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882377.86394: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882377.86440: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882377.86451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882377.86556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882377.88355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882377.88400: stderr chunk (state=3): >>><<< 11124 1726882377.88403: stdout chunk (state=3): >>><<< 11124 1726882377.88420: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882377.88426: handler run complete 11124 1726882377.88467: attempt loop complete, returning result 11124 1726882377.88470: _execute() done 11124 1726882377.88473: dumping result to json 11124 1726882377.88484: done dumping result, returning 11124 1726882377.88492: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-8362-0f62-000000000032] 11124 1726882377.88496: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000032 11124 1726882377.88711: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000032 11124 1726882377.88714: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882377.88760: no more pending results, returning what we have 11124 1726882377.88763: results queue empty 11124 1726882377.88765: checking for any_errors_fatal 11124 1726882377.88771: done checking for any_errors_fatal 11124 1726882377.88771: checking for max_fail_percentage 11124 1726882377.88773: done checking for max_fail_percentage 11124 1726882377.88773: checking to see if all hosts have failed and the running result is not ok 11124 1726882377.88774: done checking to see if all hosts have failed 11124 1726882377.88775: getting the remaining hosts for this loop 11124 1726882377.88776: done getting the remaining hosts for this loop 11124 1726882377.88780: getting the next task for host managed_node1 11124 1726882377.88786: done getting next task for host managed_node1 11124 1726882377.88790: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11124 1726882377.88793: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882377.88802: getting variables 11124 1726882377.88804: in VariableManager get_vars() 11124 1726882377.88839: Calling all_inventory to load vars for managed_node1 11124 1726882377.88842: Calling groups_inventory to load vars for managed_node1 11124 1726882377.88844: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882377.88854: Calling all_plugins_play to load vars for managed_node1 11124 1726882377.88856: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882377.88859: Calling groups_plugins_play to load vars for managed_node1 11124 1726882377.89786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882377.90727: done with get_vars() 11124 1726882377.90742: done getting variables 11124 1726882377.90789: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:32:57 -0400 (0:00:00.842) 0:00:18.150 ****** 11124 1726882377.90814: entering _queue_task() for managed_node1/service 11124 1726882377.91044: worker is 1 (out 
of 1 available) 11124 1726882377.91060: exiting _queue_task() for managed_node1/service 11124 1726882377.91075: done queuing things up, now waiting for results queue to drain 11124 1726882377.91076: waiting for pending results... 11124 1726882377.91248: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11124 1726882377.91339: in run() - task 0e448fcc-3ce9-8362-0f62-000000000033 11124 1726882377.91355: variable 'ansible_search_path' from source: unknown 11124 1726882377.91358: variable 'ansible_search_path' from source: unknown 11124 1726882377.91386: calling self._execute() 11124 1726882377.91453: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882377.91457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882377.91464: variable 'omit' from source: magic vars 11124 1726882377.91734: variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.91743: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882377.91822: variable 'network_provider' from source: set_fact 11124 1726882377.91825: Evaluated conditional (network_provider == "nm"): True 11124 1726882377.91893: variable '__network_wpa_supplicant_required' from source: role '' defaults 11124 1726882377.91955: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11124 1726882377.92068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882377.93542: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882377.93587: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882377.93615: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 
1726882377.93640: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882377.93660: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882377.93727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.93747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.93767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.93793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.93803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.93837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.93855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.93874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.93900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.93910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.93939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882377.93957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882377.93975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.93999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882377.94009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882377.94107: variable 'network_connections' from source: task vars 11124 1726882377.94115: variable 'controller_profile' from source: play vars 11124 1726882377.94162: variable 'controller_profile' from source: play vars 11124 1726882377.94172: variable 'controller_device' from source: play vars 
11124 1726882377.94213: variable 'controller_device' from source: play vars 11124 1726882377.94221: variable 'port1_profile' from source: play vars 11124 1726882377.94266: variable 'port1_profile' from source: play vars 11124 1726882377.94272: variable 'dhcp_interface1' from source: play vars 11124 1726882377.94314: variable 'dhcp_interface1' from source: play vars 11124 1726882377.94321: variable 'controller_profile' from source: play vars 11124 1726882377.94364: variable 'controller_profile' from source: play vars 11124 1726882377.94372: variable 'port2_profile' from source: play vars 11124 1726882377.94413: variable 'port2_profile' from source: play vars 11124 1726882377.94419: variable 'dhcp_interface2' from source: play vars 11124 1726882377.94461: variable 'dhcp_interface2' from source: play vars 11124 1726882377.94469: variable 'controller_profile' from source: play vars 11124 1726882377.94512: variable 'controller_profile' from source: play vars 11124 1726882377.94559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882377.94672: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882377.94698: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882377.94722: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882377.94743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882377.94774: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882377.94828: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 11124 1726882377.94831: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882377.94852: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882377.94889: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882377.95144: variable 'network_connections' from source: task vars 11124 1726882377.95157: variable 'controller_profile' from source: play vars 11124 1726882377.95196: variable 'controller_profile' from source: play vars 11124 1726882377.95202: variable 'controller_device' from source: play vars 11124 1726882377.95243: variable 'controller_device' from source: play vars 11124 1726882377.95252: variable 'port1_profile' from source: play vars 11124 1726882377.95295: variable 'port1_profile' from source: play vars 11124 1726882377.95301: variable 'dhcp_interface1' from source: play vars 11124 1726882377.95342: variable 'dhcp_interface1' from source: play vars 11124 1726882377.95347: variable 'controller_profile' from source: play vars 11124 1726882377.95392: variable 'controller_profile' from source: play vars 11124 1726882377.95398: variable 'port2_profile' from source: play vars 11124 1726882377.95439: variable 'port2_profile' from source: play vars 11124 1726882377.95445: variable 'dhcp_interface2' from source: play vars 11124 1726882377.95491: variable 'dhcp_interface2' from source: play vars 11124 1726882377.95495: variable 'controller_profile' from source: play vars 11124 1726882377.95537: variable 'controller_profile' from source: play vars 11124 1726882377.95568: Evaluated conditional (__network_wpa_supplicant_required): False 11124 1726882377.95571: when evaluation is False, 
skipping this task 11124 1726882377.95573: _execute() done 11124 1726882377.95576: dumping result to json 11124 1726882377.95578: done dumping result, returning 11124 1726882377.95590: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-8362-0f62-000000000033] 11124 1726882377.95593: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000033 11124 1726882377.95674: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000033 11124 1726882377.95676: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11124 1726882377.95756: no more pending results, returning what we have 11124 1726882377.95759: results queue empty 11124 1726882377.95760: checking for any_errors_fatal 11124 1726882377.95779: done checking for any_errors_fatal 11124 1726882377.95780: checking for max_fail_percentage 11124 1726882377.95782: done checking for max_fail_percentage 11124 1726882377.95783: checking to see if all hosts have failed and the running result is not ok 11124 1726882377.95783: done checking to see if all hosts have failed 11124 1726882377.95784: getting the remaining hosts for this loop 11124 1726882377.95786: done getting the remaining hosts for this loop 11124 1726882377.95789: getting the next task for host managed_node1 11124 1726882377.95794: done getting next task for host managed_node1 11124 1726882377.95798: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11124 1726882377.95806: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882377.95819: getting variables 11124 1726882377.95821: in VariableManager get_vars() 11124 1726882377.95860: Calling all_inventory to load vars for managed_node1 11124 1726882377.95865: Calling groups_inventory to load vars for managed_node1 11124 1726882377.95867: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882377.95876: Calling all_plugins_play to load vars for managed_node1 11124 1726882377.95878: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882377.95880: Calling groups_plugins_play to load vars for managed_node1 11124 1726882377.96747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882377.97695: done with get_vars() 11124 1726882377.97711: done getting variables 11124 1726882377.97756: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:32:57 -0400 (0:00:00.069) 0:00:18.220 ****** 11124 1726882377.97779: entering _queue_task() for managed_node1/service 11124 1726882377.97995: worker is 1 (out of 1 available) 11124 1726882377.98007: exiting _queue_task() for managed_node1/service 
11124 1726882377.98020: done queuing things up, now waiting for results queue to drain 11124 1726882377.98022: waiting for pending results... 11124 1726882377.98197: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11124 1726882377.98284: in run() - task 0e448fcc-3ce9-8362-0f62-000000000034 11124 1726882377.98298: variable 'ansible_search_path' from source: unknown 11124 1726882377.98301: variable 'ansible_search_path' from source: unknown 11124 1726882377.98332: calling self._execute() 11124 1726882377.98398: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882377.98403: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882377.98409: variable 'omit' from source: magic vars 11124 1726882377.98678: variable 'ansible_distribution_major_version' from source: facts 11124 1726882377.98687: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882377.98765: variable 'network_provider' from source: set_fact 11124 1726882377.98768: Evaluated conditional (network_provider == "initscripts"): False 11124 1726882377.98772: when evaluation is False, skipping this task 11124 1726882377.98774: _execute() done 11124 1726882377.98776: dumping result to json 11124 1726882377.98781: done dumping result, returning 11124 1726882377.98788: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-8362-0f62-000000000034] 11124 1726882377.98793: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000034 11124 1726882377.98875: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000034 11124 1726882377.98877: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882377.98941: no more pending results, returning what we have 
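[editor's note] For orientation, the three service-management tasks traced in this log correspond to module invocations of roughly the following shape. This is a sketch reconstructed from the `module_args` and `false_condition` fields logged above, not the role's actual task file (`roles/network/tasks/main.yml` in `fedora.linux_system_roles`); the legacy service name `network` in the last task is an assumption.

```yaml
# Reconstructed sketch. Task names and parameter values are taken from the
# logged module_args and skip conditions; everything else is approximate.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # the log shows this result was censored via no_log

- name: Enable and start wpa_supplicant
  ansible.builtin.systemd:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required   # False in this run, so skipped

- name: Enable network service
  ansible.builtin.service:
    name: network            # assumption: legacy initscripts service name
    state: started
    enabled: true
  when: network_provider == "initscripts"   # provider is "nm" here, so skipped
```

In this run only the NetworkManager task executed (unchanged, rc=0); the other two were skipped because their `when` conditionals evaluated to False.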
11124 1726882377.98945: results queue empty 11124 1726882377.98945: checking for any_errors_fatal 11124 1726882377.98950: done checking for any_errors_fatal 11124 1726882377.98951: checking for max_fail_percentage 11124 1726882377.98952: done checking for max_fail_percentage 11124 1726882377.98953: checking to see if all hosts have failed and the running result is not ok 11124 1726882377.98954: done checking to see if all hosts have failed 11124 1726882377.98955: getting the remaining hosts for this loop 11124 1726882377.98956: done getting the remaining hosts for this loop 11124 1726882377.98959: getting the next task for host managed_node1 11124 1726882377.98965: done getting next task for host managed_node1 11124 1726882377.98969: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11124 1726882377.98972: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882377.98986: getting variables 11124 1726882377.98993: in VariableManager get_vars() 11124 1726882377.99025: Calling all_inventory to load vars for managed_node1 11124 1726882377.99027: Calling groups_inventory to load vars for managed_node1 11124 1726882377.99029: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882377.99036: Calling all_plugins_play to load vars for managed_node1 11124 1726882377.99037: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882377.99039: Calling groups_plugins_play to load vars for managed_node1 11124 1726882377.99807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882378.00735: done with get_vars() 11124 1726882378.00751: done getting variables 11124 1726882378.00795: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:32:58 -0400 (0:00:00.030) 0:00:18.250 ****** 11124 1726882378.00819: entering _queue_task() for managed_node1/copy 11124 1726882378.01043: worker is 1 (out of 1 available) 11124 1726882378.01059: exiting _queue_task() for managed_node1/copy 11124 1726882378.01072: done queuing things up, now waiting for results queue to drain 11124 1726882378.01074: waiting for pending results... 
11124 1726882378.01251: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11124 1726882378.01344: in run() - task 0e448fcc-3ce9-8362-0f62-000000000035 11124 1726882378.01358: variable 'ansible_search_path' from source: unknown 11124 1726882378.01362: variable 'ansible_search_path' from source: unknown 11124 1726882378.01397: calling self._execute() 11124 1726882378.01469: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.01473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.01477: variable 'omit' from source: magic vars 11124 1726882378.01745: variable 'ansible_distribution_major_version' from source: facts 11124 1726882378.01758: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882378.01923: variable 'network_provider' from source: set_fact 11124 1726882378.01926: Evaluated conditional (network_provider == "initscripts"): False 11124 1726882378.01929: when evaluation is False, skipping this task 11124 1726882378.01931: _execute() done 11124 1726882378.01933: dumping result to json 11124 1726882378.01935: done dumping result, returning 11124 1726882378.01938: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-8362-0f62-000000000035] 11124 1726882378.01940: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000035 11124 1726882378.02042: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000035 11124 1726882378.02052: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11124 1726882378.02102: no more pending results, returning what we have 11124 1726882378.02106: results queue empty 11124 1726882378.02107: checking for 
any_errors_fatal 11124 1726882378.02114: done checking for any_errors_fatal 11124 1726882378.02116: checking for max_fail_percentage 11124 1726882378.02118: done checking for max_fail_percentage 11124 1726882378.02119: checking to see if all hosts have failed and the running result is not ok 11124 1726882378.02120: done checking to see if all hosts have failed 11124 1726882378.02120: getting the remaining hosts for this loop 11124 1726882378.02121: done getting the remaining hosts for this loop 11124 1726882378.02125: getting the next task for host managed_node1 11124 1726882378.02132: done getting next task for host managed_node1 11124 1726882378.02136: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11124 1726882378.02257: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882378.02277: getting variables 11124 1726882378.02280: in VariableManager get_vars() 11124 1726882378.02325: Calling all_inventory to load vars for managed_node1 11124 1726882378.02328: Calling groups_inventory to load vars for managed_node1 11124 1726882378.02331: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882378.02344: Calling all_plugins_play to load vars for managed_node1 11124 1726882378.02347: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882378.02349: Calling groups_plugins_play to load vars for managed_node1 11124 1726882378.03700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882378.04626: done with get_vars() 11124 1726882378.04643: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:32:58 -0400 (0:00:00.038) 0:00:18.289 ****** 11124 1726882378.04707: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11124 1726882378.04708: Creating lock for fedora.linux_system_roles.network_connections 11124 1726882378.04933: worker is 1 (out of 1 available) 11124 1726882378.04946: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11124 1726882378.04960: done queuing things up, now waiting for results queue to drain 11124 1726882378.04961: waiting for pending results... 
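In the task execution that follows, Ansible first creates a per-invocation temp directory on the remote host (`umask 77 && mkdir -p ... && mkdir ...`) before transferring `AnsiballZ_network_connections.py` into it. A minimal standalone sketch of that pattern — the path here is illustrative, not the real `ansible-tmp-<timestamp>-<pid>-<rand>` name from the log:

```shell
# Reproduce the remote temp-dir creation pattern visible in the log below.
# umask 77 strips group/other bits, so the directory holding the module
# payload is readable only by the connecting user.
base="$(mktemp -d)"   # stand-in for the remote $HOME
( umask 77 && mkdir -p "$base/.ansible/tmp" \
    && mkdir "$base/.ansible/tmp/ansible-tmp-demo" )
# Resulting mode is 0777 & ~0077 = 0700:
perms="$(stat -c %a "$base/.ansible/tmp/ansible-tmp-demo")"
echo "$perms"
```

The double-`mkdir` in the real command is deliberate: `mkdir -p` tolerates a pre-existing `~/.ansible/tmp`, while the plain `mkdir` of the timestamped directory fails if it already exists, guarding against reuse of another invocation's directory.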
11124 1726882378.05204: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11124 1726882378.05347: in run() - task 0e448fcc-3ce9-8362-0f62-000000000036 11124 1726882378.05372: variable 'ansible_search_path' from source: unknown 11124 1726882378.05380: variable 'ansible_search_path' from source: unknown 11124 1726882378.05422: calling self._execute() 11124 1726882378.05515: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.05526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.05541: variable 'omit' from source: magic vars 11124 1726882378.05902: variable 'ansible_distribution_major_version' from source: facts 11124 1726882378.05918: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882378.05930: variable 'omit' from source: magic vars 11124 1726882378.06363: variable 'omit' from source: magic vars 11124 1726882378.06526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882378.08819: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882378.08908: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882378.09271: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882378.09274: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882378.09277: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882378.09279: variable 'network_provider' from source: set_fact 11124 1726882378.09282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882378.09285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882378.09287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882378.09328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882378.09346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882378.09424: variable 'omit' from source: magic vars 11124 1726882378.09540: variable 'omit' from source: magic vars 11124 1726882378.09643: variable 'network_connections' from source: task vars 11124 1726882378.09665: variable 'controller_profile' from source: play vars 11124 1726882378.09720: variable 'controller_profile' from source: play vars 11124 1726882378.09727: variable 'controller_device' from source: play vars 11124 1726882378.09791: variable 'controller_device' from source: play vars 11124 1726882378.09798: variable 'port1_profile' from source: play vars 11124 1726882378.09848: variable 'port1_profile' from source: play vars 11124 1726882378.09858: variable 'dhcp_interface1' from source: play vars 11124 1726882378.09919: variable 'dhcp_interface1' from source: play vars 11124 1726882378.09924: variable 'controller_profile' from source: play vars 11124 1726882378.09982: variable 'controller_profile' from source: play vars 11124 1726882378.09993: 
variable 'port2_profile' from source: play vars 11124 1726882378.10048: variable 'port2_profile' from source: play vars 11124 1726882378.10058: variable 'dhcp_interface2' from source: play vars 11124 1726882378.10121: variable 'dhcp_interface2' from source: play vars 11124 1726882378.10127: variable 'controller_profile' from source: play vars 11124 1726882378.10185: variable 'controller_profile' from source: play vars 11124 1726882378.10380: variable 'omit' from source: magic vars 11124 1726882378.10388: variable '__lsr_ansible_managed' from source: task vars 11124 1726882378.10451: variable '__lsr_ansible_managed' from source: task vars 11124 1726882378.10635: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11124 1726882378.10862: Loaded config def from plugin (lookup/template) 11124 1726882378.10869: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11124 1726882378.10897: File lookup term: get_ansible_managed.j2 11124 1726882378.10900: variable 'ansible_search_path' from source: unknown 11124 1726882378.10904: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11124 1726882378.10917: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11124 1726882378.10934: variable 'ansible_search_path' from source: unknown 11124 1726882378.17280: variable 'ansible_managed' from source: unknown 11124 1726882378.17361: variable 'omit' from source: magic vars 11124 1726882378.17384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882378.17403: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882378.17417: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882378.17431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882378.17440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882378.17462: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882378.17466: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.17468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.17534: Set connection var ansible_shell_executable to /bin/sh 11124 1726882378.17537: Set connection var ansible_shell_type to sh 11124 1726882378.17544: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882378.17551: Set connection var ansible_timeout to 10 11124 1726882378.17562: Set connection var ansible_pipelining to False 11124 1726882378.17566: Set connection var ansible_connection to ssh 11124 1726882378.17579: 
variable 'ansible_shell_executable' from source: unknown 11124 1726882378.17582: variable 'ansible_connection' from source: unknown 11124 1726882378.17585: variable 'ansible_module_compression' from source: unknown 11124 1726882378.17587: variable 'ansible_shell_type' from source: unknown 11124 1726882378.17590: variable 'ansible_shell_executable' from source: unknown 11124 1726882378.17592: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.17594: variable 'ansible_pipelining' from source: unknown 11124 1726882378.17596: variable 'ansible_timeout' from source: unknown 11124 1726882378.17598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.17687: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882378.17695: variable 'omit' from source: magic vars 11124 1726882378.17701: starting attempt loop 11124 1726882378.17704: running the handler 11124 1726882378.17717: _low_level_execute_command(): starting 11124 1726882378.17723: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882378.18227: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882378.18235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882378.18271: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882378.18284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882378.18296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.18336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882378.18348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882378.18459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882378.20119: stdout chunk (state=3): >>>/root <<< 11124 1726882378.20224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882378.20271: stderr chunk (state=3): >>><<< 11124 1726882378.20274: stdout chunk (state=3): >>><<< 11124 1726882378.20292: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882378.20302: _low_level_execute_command(): starting 11124 1726882378.20307: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620 `" && echo ansible-tmp-1726882378.2029243-12009-28078133759620="` echo /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620 `" ) && sleep 0' 11124 1726882378.20730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882378.20735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882378.20785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.20789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882378.20791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.20846: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882378.20855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882378.20858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882378.20948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882378.22829: stdout chunk (state=3): >>>ansible-tmp-1726882378.2029243-12009-28078133759620=/root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620 <<< 11124 1726882378.22943: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882378.22994: stderr chunk (state=3): >>><<< 11124 1726882378.23001: stdout chunk (state=3): >>><<< 11124 1726882378.23043: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882378.2029243-12009-28078133759620=/root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882378.23057: variable 'ansible_module_compression' from source: unknown 11124 1726882378.23101: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11124 1726882378.23116: ANSIBALLZ: Acquiring lock 11124 1726882378.23119: ANSIBALLZ: Lock acquired: 139628945094032 11124 1726882378.23121: ANSIBALLZ: Creating module 11124 1726882378.38890: ANSIBALLZ: Writing module into payload 11124 1726882378.39223: ANSIBALLZ: Writing module 11124 1726882378.39247: ANSIBALLZ: Renaming module 11124 1726882378.39253: ANSIBALLZ: Done creating module 11124 1726882378.39272: variable 'ansible_facts' from source: unknown 11124 1726882378.39335: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/AnsiballZ_network_connections.py 11124 1726882378.39439: Sending initial data 11124 1726882378.39443: Sent initial data (167 bytes) 11124 1726882378.40128: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882378.40134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882378.40169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.40181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.40229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882378.40247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882378.40358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882378.42193: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882378.42287: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882378.42382: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp3fzt0pk9 /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/AnsiballZ_network_connections.py <<< 11124 1726882378.42472: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882378.43789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882378.43887: stderr chunk (state=3): >>><<< 11124 1726882378.43891: stdout chunk (state=3): >>><<< 11124 1726882378.43908: done transferring module to remote 11124 1726882378.43917: _low_level_execute_command(): starting 11124 1726882378.43922: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/ /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/AnsiballZ_network_connections.py && sleep 0' 11124 1726882378.44355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882378.44361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882378.44393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.44405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.44462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882378.44473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882378.44576: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882378.46321: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882378.46372: stderr chunk (state=3): >>><<< 11124 1726882378.46376: stdout chunk (state=3): >>><<< 11124 1726882378.46389: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882378.46392: _low_level_execute_command(): starting 11124 1726882378.46397: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/AnsiballZ_network_connections.py && sleep 0' 11124 1726882378.46831: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882378.46837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882378.46872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 
1726882378.46883: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.46941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882378.46952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882378.47066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882378.86506: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": 
"up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 11124 1726882378.86666: stdout chunk (state=3): >>> <<< 11124 1726882378.88409: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882378.88461: stderr chunk (state=3): >>><<< 11124 1726882378.88466: stdout chunk (state=3): >>><<< 11124 1726882378.88484: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], 
"__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
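The `module_args` captured in the result above correspond to a role invocation along these lines. This is a hedged reconstruction, not the actual playbook: the connection values are copied verbatim from the logged `module_args`, while the top-level variable name `network_connections` is the standard input variable of the `fedora.linux_system_roles.network` role.

```yaml
# Reconstructed role variables that would produce the logged
# module_args (values taken from the log; overall structure assumed).
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: deprecated-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    master: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    master: bond0
```

The log's `(is-modified)` / `(not-active)` markers indicate NetworkManager had to update and activate these profiles rather than finding them already in the desired state.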
11124 1726882378.88526: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882378.88534: _low_level_execute_command(): starting 11124 1726882378.88538: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882378.2029243-12009-28078133759620/ > /dev/null 2>&1 && sleep 0' 11124 1726882378.88978: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882378.88983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882378.89035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.89038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882378.89045: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882378.89047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882378.89052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882378.89091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882378.89103: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882378.89205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882378.91171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882378.91206: stderr chunk (state=3): >>><<< 11124 1726882378.91209: stdout chunk (state=3): >>><<< 11124 1726882378.91226: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882378.91232: handler run complete 11124 1726882378.91279: attempt loop complete, returning result 11124 1726882378.91284: _execute() done 11124 1726882378.91287: dumping result to json 11124 1726882378.91293: done dumping result, returning 11124 1726882378.91303: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-8362-0f62-000000000036] 11124 1726882378.91306: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000036 11124 1726882378.91440: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000036 11124 1726882378.91443: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection 
bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf (not-active) 11124 1726882378.91572: no more pending results, returning what we have 11124 1726882378.91576: results queue empty 11124 1726882378.91577: checking for any_errors_fatal 11124 1726882378.91584: done checking for any_errors_fatal 11124 1726882378.91585: checking for max_fail_percentage 11124 1726882378.91586: done checking for max_fail_percentage 11124 1726882378.91587: checking to see if all hosts have failed and the running result is not ok 11124 1726882378.91588: done checking to see if all hosts have failed 11124 1726882378.91588: getting the remaining hosts for this loop 11124 1726882378.91589: done getting the remaining hosts for this loop 11124 1726882378.91593: getting the next task for host managed_node1 11124 1726882378.91598: done getting next task for host managed_node1 11124 1726882378.91602: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11124 1726882378.91604: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882378.91613: getting variables 11124 1726882378.91614: in VariableManager get_vars() 11124 1726882378.91652: Calling all_inventory to load vars for managed_node1 11124 1726882378.91654: Calling groups_inventory to load vars for managed_node1 11124 1726882378.91656: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882378.91667: Calling all_plugins_play to load vars for managed_node1 11124 1726882378.91670: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882378.91672: Calling groups_plugins_play to load vars for managed_node1 11124 1726882378.93052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882378.94079: done with get_vars() 11124 1726882378.94094: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:32:58 -0400 (0:00:00.894) 0:00:19.184 ****** 11124 1726882378.94153: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11124 1726882378.94155: Creating lock for fedora.linux_system_roles.network_state 11124 1726882378.94379: worker is 1 (out of 1 available) 11124 1726882378.94393: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11124 1726882378.94405: done queuing things up, now waiting for results queue to drain 11124 1726882378.94407: waiting for pending results... 
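The "Configure networking state" task queued here is gated on the role's `network_state` variable; the skip result that follows shows the logged condition `network_state != {}` evaluating to False because the variable keeps its role default. A minimal sketch of the two cases (structure assumed; only the conditional text comes from the log):

```yaml
# Role default -> the "Configure networking state" task is skipped,
# as seen in the log ("false_condition": "network_state != {}").
network_state: {}

# A non-empty mapping would instead run the network_state provider,
# e.g. (illustrative keys, not taken from this log):
# network_state:
#   interfaces: []
```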
11124 1726882378.94596: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11124 1726882378.94731: in run() - task 0e448fcc-3ce9-8362-0f62-000000000037 11124 1726882378.94752: variable 'ansible_search_path' from source: unknown 11124 1726882378.94765: variable 'ansible_search_path' from source: unknown 11124 1726882378.94812: calling self._execute() 11124 1726882378.94914: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.94928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.94949: variable 'omit' from source: magic vars 11124 1726882378.95344: variable 'ansible_distribution_major_version' from source: facts 11124 1726882378.95370: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882378.95508: variable 'network_state' from source: role '' defaults 11124 1726882378.95527: Evaluated conditional (network_state != {}): False 11124 1726882378.95535: when evaluation is False, skipping this task 11124 1726882378.95551: _execute() done 11124 1726882378.95558: dumping result to json 11124 1726882378.95574: done dumping result, returning 11124 1726882378.95580: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-8362-0f62-000000000037] 11124 1726882378.95583: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000037 11124 1726882378.95678: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000037 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882378.95745: no more pending results, returning what we have 11124 1726882378.95749: results queue empty 11124 1726882378.95749: checking for any_errors_fatal 11124 1726882378.95760: done checking for any_errors_fatal 11124 1726882378.95761: checking for 
max_fail_percentage 11124 1726882378.95765: done checking for max_fail_percentage 11124 1726882378.95766: checking to see if all hosts have failed and the running result is not ok 11124 1726882378.95767: done checking to see if all hosts have failed 11124 1726882378.95767: getting the remaining hosts for this loop 11124 1726882378.95769: done getting the remaining hosts for this loop 11124 1726882378.95772: getting the next task for host managed_node1 11124 1726882378.95778: done getting next task for host managed_node1 11124 1726882378.95781: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11124 1726882378.95784: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882378.95797: getting variables 11124 1726882378.95799: in VariableManager get_vars() 11124 1726882378.95833: Calling all_inventory to load vars for managed_node1 11124 1726882378.95836: Calling groups_inventory to load vars for managed_node1 11124 1726882378.95838: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882378.95846: Calling all_plugins_play to load vars for managed_node1 11124 1726882378.95849: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882378.95851: Calling groups_plugins_play to load vars for managed_node1 11124 1726882378.96627: WORKER PROCESS EXITING 11124 1726882378.96645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882378.97594: done with get_vars() 11124 1726882378.97611: done getting variables 11124 1726882378.97690: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:32:58 -0400 (0:00:00.035) 0:00:19.219 ****** 11124 1726882378.97733: entering _queue_task() for managed_node1/debug 11124 1726882378.98001: worker is 1 (out of 1 available) 11124 1726882378.98014: exiting _queue_task() for managed_node1/debug 11124 1726882378.98026: done queuing things up, now waiting for results queue to drain 11124 1726882378.98028: waiting for pending results... 
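The debug task queued here (tasks/main.yml:177) prints `__network_connections_result.stderr_lines`, as the `ok:` result below shows. Its task file entry plausibly looks like the following sketch, inferred from the logged task name and output variable rather than quoted from the role source:

```yaml
# Hedged reconstruction of the task at tasks/main.yml:177;
# name and printed variable are taken from the log output.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```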
11124 1726882378.98315: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11124 1726882378.98452: in run() - task 0e448fcc-3ce9-8362-0f62-000000000038 11124 1726882378.98479: variable 'ansible_search_path' from source: unknown 11124 1726882378.98487: variable 'ansible_search_path' from source: unknown 11124 1726882378.98529: calling self._execute() 11124 1726882378.98626: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.98638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.98656: variable 'omit' from source: magic vars 11124 1726882378.99030: variable 'ansible_distribution_major_version' from source: facts 11124 1726882378.99047: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882378.99063: variable 'omit' from source: magic vars 11124 1726882378.99125: variable 'omit' from source: magic vars 11124 1726882378.99168: variable 'omit' from source: magic vars 11124 1726882378.99213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882378.99256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882378.99283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882378.99307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882378.99327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882378.99353: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882378.99358: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.99361: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11124 1726882378.99446: Set connection var ansible_shell_executable to /bin/sh 11124 1726882378.99456: Set connection var ansible_shell_type to sh 11124 1726882378.99464: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882378.99470: Set connection var ansible_timeout to 10 11124 1726882378.99475: Set connection var ansible_pipelining to False 11124 1726882378.99477: Set connection var ansible_connection to ssh 11124 1726882378.99493: variable 'ansible_shell_executable' from source: unknown 11124 1726882378.99495: variable 'ansible_connection' from source: unknown 11124 1726882378.99498: variable 'ansible_module_compression' from source: unknown 11124 1726882378.99500: variable 'ansible_shell_type' from source: unknown 11124 1726882378.99503: variable 'ansible_shell_executable' from source: unknown 11124 1726882378.99505: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882378.99508: variable 'ansible_pipelining' from source: unknown 11124 1726882378.99511: variable 'ansible_timeout' from source: unknown 11124 1726882378.99515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882378.99625: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882378.99634: variable 'omit' from source: magic vars 11124 1726882378.99638: starting attempt loop 11124 1726882378.99641: running the handler 11124 1726882378.99737: variable '__network_connections_result' from source: set_fact 11124 1726882378.99791: handler run complete 11124 1726882378.99804: attempt loop complete, returning result 11124 1726882378.99807: _execute() done 11124 1726882378.99810: dumping result to json 11124 1726882378.99812: 
done dumping result, returning 11124 1726882378.99820: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-8362-0f62-000000000038] 11124 1726882378.99824: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000038 11124 1726882378.99907: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000038 11124 1726882378.99909: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf (not-active)" ] } 11124 1726882378.99970: no more pending results, returning what we have 11124 1726882378.99974: results queue empty 11124 1726882378.99975: checking for any_errors_fatal 11124 1726882378.99981: done checking for any_errors_fatal 11124 1726882378.99982: checking for max_fail_percentage 11124 1726882378.99984: done checking for max_fail_percentage 11124 1726882378.99985: checking to see if all hosts have failed and the running result is not ok 11124 1726882378.99986: done checking to see if all hosts have failed 11124 1726882378.99987: getting the remaining hosts for this loop 11124 1726882378.99988: done getting the remaining hosts for this loop 11124 1726882378.99992: getting the next task for host 
managed_node1 11124 1726882378.99998: done getting next task for host managed_node1 11124 1726882379.00002: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11124 1726882379.00004: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882379.00014: getting variables 11124 1726882379.00016: in VariableManager get_vars() 11124 1726882379.00051: Calling all_inventory to load vars for managed_node1 11124 1726882379.00054: Calling groups_inventory to load vars for managed_node1 11124 1726882379.00057: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.00072: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.00075: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.00078: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.00979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.02328: done with get_vars() 11124 1726882379.02349: done getting variables 11124 1726882379.02411: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:32:59 -0400 (0:00:00.047) 0:00:19.266 ****** 11124 1726882379.02438: entering _queue_task() for managed_node1/debug 11124 1726882379.02677: worker is 1 (out of 1 available) 11124 1726882379.02690: exiting _queue_task() for managed_node1/debug 11124 1726882379.02702: done queuing things up, now waiting for results queue to drain 11124 1726882379.02704: waiting for pending results... 11124 1726882379.02916: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11124 1726882379.03053: in run() - task 0e448fcc-3ce9-8362-0f62-000000000039 11124 1726882379.03079: variable 'ansible_search_path' from source: unknown 11124 1726882379.03087: variable 'ansible_search_path' from source: unknown 11124 1726882379.03130: calling self._execute() 11124 1726882379.03236: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.03250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.03270: variable 'omit' from source: magic vars 11124 1726882379.03632: variable 'ansible_distribution_major_version' from source: facts 11124 1726882379.03649: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882379.03665: variable 'omit' from source: magic vars 11124 1726882379.03732: variable 'omit' from source: magic vars 11124 1726882379.03772: variable 'omit' from source: magic vars 11124 1726882379.03824: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882379.03865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882379.03880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11124 1726882379.03895: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882379.03906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882379.03942: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882379.03946: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.03948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.04018: Set connection var ansible_shell_executable to /bin/sh 11124 1726882379.04024: Set connection var ansible_shell_type to sh 11124 1726882379.04031: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882379.04036: Set connection var ansible_timeout to 10 11124 1726882379.04041: Set connection var ansible_pipelining to False 11124 1726882379.04043: Set connection var ansible_connection to ssh 11124 1726882379.04060: variable 'ansible_shell_executable' from source: unknown 11124 1726882379.04064: variable 'ansible_connection' from source: unknown 11124 1726882379.04067: variable 'ansible_module_compression' from source: unknown 11124 1726882379.04069: variable 'ansible_shell_type' from source: unknown 11124 1726882379.04072: variable 'ansible_shell_executable' from source: unknown 11124 1726882379.04074: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.04076: variable 'ansible_pipelining' from source: unknown 11124 1726882379.04079: variable 'ansible_timeout' from source: unknown 11124 1726882379.04083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.04185: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882379.04192: variable 'omit' from source: magic vars 11124 1726882379.04197: starting attempt loop 11124 1726882379.04199: running the handler 11124 1726882379.04241: variable '__network_connections_result' from source: set_fact 11124 1726882379.04293: variable '__network_connections_result' from source: set_fact 11124 1726882379.04403: handler run complete 11124 1726882379.04423: attempt loop complete, returning result 11124 1726882379.04426: _execute() done 11124 1726882379.04430: dumping result to json 11124 1726882379.04435: done dumping result, returning 11124 1726882379.04443: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-8362-0f62-000000000039] 11124 1726882379.04445: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000039 11124 1726882379.04534: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000039 11124 1726882379.04537: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "deprecated-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "interface_name": "test1",
                        "master": "bond0",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "interface_name": "test2",
                        "master": "bond0",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a7e63e00-15bf-4393-a589-86c57260d9e4 (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 8ae45e77-4d55-4f5d-855c-022ad1860e4a (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 5930b051-7181-4874-ab99-c3feeaedcfbf (not-active)"
        ]
    }
}
11124 1726882379.04660: no more pending results, returning what we have 11124 1726882379.04665: results queue empty 11124 1726882379.04670: checking for any_errors_fatal 11124 1726882379.04676: done checking for any_errors_fatal 11124 1726882379.04676: checking for max_fail_percentage 11124 1726882379.04678: done checking for max_fail_percentage 11124 1726882379.04678: checking to see if all hosts have failed and the running result is not ok 11124 1726882379.04679: done checking to see if all hosts have failed 11124 1726882379.04680: getting the remaining
hosts for this loop 11124 1726882379.04681: done getting the remaining hosts for this loop 11124 1726882379.04684: getting the next task for host managed_node1 11124 1726882379.04689: done getting next task for host managed_node1 11124 1726882379.04692: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11124 1726882379.04695: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882379.04704: getting variables 11124 1726882379.04705: in VariableManager get_vars() 11124 1726882379.04733: Calling all_inventory to load vars for managed_node1 11124 1726882379.04734: Calling groups_inventory to load vars for managed_node1 11124 1726882379.04736: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.04742: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.04744: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.04746: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.05534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.06941: done with get_vars() 11124 1726882379.06959: done getting variables 11124 1726882379.07001: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024  21:32:59 -0400 (0:00:00.045)       0:00:19.312 ******
11124 1726882379.07024: entering _queue_task() for managed_node1/debug 11124 1726882379.07240: worker is 1 (out of 1 available) 11124 1726882379.07256: exiting _queue_task() for managed_node1/debug 11124 1726882379.07270: done queuing things up, now waiting for results queue to drain 11124 1726882379.07272: waiting for pending results... 11124 1726882379.07445: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11124 1726882379.07541: in run() - task 0e448fcc-3ce9-8362-0f62-00000000003a 11124 1726882379.07556: variable 'ansible_search_path' from source: unknown 11124 1726882379.07559: variable 'ansible_search_path' from source: unknown 11124 1726882379.07595: calling self._execute() 11124 1726882379.07661: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.07667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.07676: variable 'omit' from source: magic vars 11124 1726882379.07937: variable 'ansible_distribution_major_version' from source: facts 11124 1726882379.07947: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882379.08033: variable 'network_state' from source: role '' defaults 11124 1726882379.08039: Evaluated conditional (network_state != {}): False 11124 1726882379.08044: when evaluation is False, skipping this task 11124 1726882379.08047: _execute() done 11124 1726882379.08052: dumping result to json 11124 1726882379.08055: done
dumping result, returning 11124 1726882379.08061: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-8362-0f62-00000000003a] 11124 1726882379.08068: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000003a 11124 1726882379.08145: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000003a 11124 1726882379.08148: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "false_condition": "network_state != {}"
}
11124 1726882379.08195: no more pending results, returning what we have 11124 1726882379.08199: results queue empty 11124 1726882379.08200: checking for any_errors_fatal 11124 1726882379.08206: done checking for any_errors_fatal 11124 1726882379.08207: checking for max_fail_percentage 11124 1726882379.08208: done checking for max_fail_percentage 11124 1726882379.08209: checking to see if all hosts have failed and the running result is not ok 11124 1726882379.08210: done checking to see if all hosts have failed 11124 1726882379.08211: getting the remaining hosts for this loop 11124 1726882379.08212: done getting the remaining hosts for this loop 11124 1726882379.08216: getting the next task for host managed_node1 11124 1726882379.08221: done getting next task for host managed_node1 11124 1726882379.08226: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11124 1726882379.08229: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False 11124 1726882379.08241: getting variables 11124 1726882379.08243: in VariableManager get_vars() 11124 1726882379.08281: Calling all_inventory to load vars for managed_node1 11124 1726882379.08284: Calling groups_inventory to load vars for managed_node1 11124 1726882379.08286: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.08295: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.08297: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.08299: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.09162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.10088: done with get_vars() 11124 1726882379.10102: done getting variables
TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024  21:32:59 -0400 (0:00:00.031)       0:00:19.344 ******
11124 1726882379.10167: entering _queue_task() for managed_node1/ping 11124 1726882379.10169: Creating lock for ping 11124 1726882379.10358: worker is 1 (out of 1 available) 11124 1726882379.10374: exiting _queue_task() for managed_node1/ping 11124 1726882379.10384: done queuing things up, now waiting for results queue to drain 11124 1726882379.10386: waiting for pending results... 
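The module_args echoed in the ok: result for the "Show debug messages for the network_connections" task earlier in this log correspond to a role input along the following lines. This is a reconstruction from the logged values only; the playbook that defined the variable is not part of this log, so the surrounding layout is an assumption:

```yaml
# Hypothetical reconstruction of the network_connections input the
# fedora.linux_system_roles.network role handed to the "nm" provider.
# Every value below appears in the logged module_args; the variable
# layout itself is assumed, not taken from the actual test playbook.
network_connections:
  - name: bond0
    interface_name: deprecated-bond
    type: bond
    state: up
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    interface_name: test1
    type: ethernet
    master: bond0
    state: up
  - name: bond0.1
    interface_name: test2
    type: ethernet
    master: bond0
    state: up
```

The (is-modified) and (not-active) markers in the module's stderr show why the run reported changed: true: the bond profile existed but was modified, and both port profiles had to be activated.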
11124 1726882379.10544: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11124 1726882379.10624: in run() - task 0e448fcc-3ce9-8362-0f62-00000000003b 11124 1726882379.10635: variable 'ansible_search_path' from source: unknown 11124 1726882379.10639: variable 'ansible_search_path' from source: unknown 11124 1726882379.10668: calling self._execute() 11124 1726882379.10733: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.10736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.10744: variable 'omit' from source: magic vars 11124 1726882379.10996: variable 'ansible_distribution_major_version' from source: facts 11124 1726882379.11004: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882379.11011: variable 'omit' from source: magic vars 11124 1726882379.11048: variable 'omit' from source: magic vars 11124 1726882379.11075: variable 'omit' from source: magic vars 11124 1726882379.11104: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882379.11127: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882379.11142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882379.11157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882379.11168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882379.11190: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882379.11193: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.11196: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11124 1726882379.11261: Set connection var ansible_shell_executable to /bin/sh 11124 1726882379.11273: Set connection var ansible_shell_type to sh 11124 1726882379.11279: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882379.11284: Set connection var ansible_timeout to 10 11124 1726882379.11289: Set connection var ansible_pipelining to False 11124 1726882379.11291: Set connection var ansible_connection to ssh 11124 1726882379.11307: variable 'ansible_shell_executable' from source: unknown 11124 1726882379.11310: variable 'ansible_connection' from source: unknown 11124 1726882379.11313: variable 'ansible_module_compression' from source: unknown 11124 1726882379.11316: variable 'ansible_shell_type' from source: unknown 11124 1726882379.11318: variable 'ansible_shell_executable' from source: unknown 11124 1726882379.11320: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.11323: variable 'ansible_pipelining' from source: unknown 11124 1726882379.11325: variable 'ansible_timeout' from source: unknown 11124 1726882379.11329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.11475: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882379.11482: variable 'omit' from source: magic vars 11124 1726882379.11492: starting attempt loop 11124 1726882379.11495: running the handler 11124 1726882379.11505: _low_level_execute_command(): starting 11124 1726882379.11512: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882379.12023: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 
1726882379.12047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882379.12060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.12074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.12119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.12128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.12149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.12246: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.13939: stdout chunk (state=3): >>>/root <<< 11124 1726882379.14043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.14094: stderr chunk (state=3): >>><<< 11124 1726882379.14097: stdout chunk (state=3): >>><<< 11124 1726882379.14114: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882379.14125: _low_level_execute_command(): starting 11124 1726882379.14130: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930 `" && echo ansible-tmp-1726882379.1411402-12044-55774815212930="` echo /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930 `" ) && sleep 0' 11124 1726882379.14541: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.14558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.14572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882379.14585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882379.14601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.14642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.14656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.14756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.16637: stdout chunk (state=3): >>>ansible-tmp-1726882379.1411402-12044-55774815212930=/root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930 <<< 11124 1726882379.16760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.16802: stderr chunk (state=3): >>><<< 11124 1726882379.16805: stdout chunk (state=3): >>><<< 11124 1726882379.16817: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882379.1411402-12044-55774815212930=/root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882379.16848: variable 'ansible_module_compression' from source: unknown 11124 1726882379.16887: ANSIBALLZ: Using lock for ping 11124 1726882379.16890: ANSIBALLZ: Acquiring lock 11124 1726882379.16892: ANSIBALLZ: Lock acquired: 139628947686752 11124 1726882379.16894: ANSIBALLZ: Creating module 11124 1726882379.25312: ANSIBALLZ: Writing module into payload 11124 1726882379.25359: ANSIBALLZ: Writing module 11124 1726882379.25392: ANSIBALLZ: Renaming module 11124 1726882379.25396: ANSIBALLZ: Done creating module 11124 1726882379.25407: variable 'ansible_facts' from source: unknown 11124 1726882379.25451: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/AnsiballZ_ping.py 11124 1726882379.25574: Sending initial data 11124 1726882379.25578: Sent initial data (152 bytes) 11124 1726882379.26396: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.26409: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.26420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.26434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.26483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.26491: stderr chunk (state=3): >>>debug2: match not found <<< 11124 
1726882379.26501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.26519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.26527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.26533: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.26541: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.26551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.26569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.26584: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.26591: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.26600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.26680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.26706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.26717: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.26849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.28705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882379.28794: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882379.28899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpslb4d8cg /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/AnsiballZ_ping.py <<< 11124 1726882379.28998: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882379.30208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.30383: stderr chunk (state=3): >>><<< 11124 1726882379.30393: stdout chunk (state=3): >>><<< 11124 1726882379.30512: done transferring module to remote 11124 1726882379.30515: _low_level_execute_command(): starting 11124 1726882379.30518: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/ /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/AnsiballZ_ping.py && sleep 0' 11124 1726882379.31088: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.31102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.31117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.31134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.31185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.31196: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.31210: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.31227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.31238: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.31248: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.31266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.31288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.31303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.31315: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.31325: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.31338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.31416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.31438: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.31456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.31587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.33415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.33511: stderr chunk (state=3): >>><<< 11124 1726882379.33522: stdout chunk (state=3): >>><<< 11124 1726882379.33633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882379.33636: _low_level_execute_command(): starting 11124 1726882379.33639: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/AnsiballZ_ping.py && sleep 0' 11124 1726882379.34231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.34245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.34265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.34289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.34334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.34346: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.34365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.34384: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 11124 1726882379.34403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.34417: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.34430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.34445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.34467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.34481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.34493: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.34512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.34591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.34612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.34636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.34777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.47639: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11124 1726882379.48653: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882379.48702: stderr chunk (state=3): >>><<< 11124 1726882379.48705: stdout chunk (state=3): >>><<< 11124 1726882379.48718: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
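The {"ping": "pong"} payload above is the standard return of the ping module. The role task driving this round trip sits at roles/network/tasks/main.yml:192; a minimal sketch of what such a task looks like (an assumed shape, not the role's verbatim source):

```yaml
# Hedged sketch of the "Re-test connectivity" task executed here; the
# real task lives at roles/network/tasks/main.yml:192 inside the
# fedora.linux_system_roles.network collection.
- name: Re-test connectivity
  ping:
```

As the surrounding log shows, even this trivial module goes through the full AnsiballZ lifecycle: create a remote tmp dir, sftp the generated AnsiballZ_ping.py payload, chmod it, run it with the remote Python, then remove the tmp dir.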
11124 1726882379.48740: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882379.48747: _low_level_execute_command(): starting 11124 1726882379.48754: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882379.1411402-12044-55774815212930/ > /dev/null 2>&1 && sleep 0' 11124 1726882379.49169: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.49175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.49210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.49217: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.49234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.49237: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.49294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.49296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.49402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.51237: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.51320: stderr chunk (state=3): >>><<< 11124 1726882379.51330: stdout chunk (state=3): >>><<< 11124 1726882379.51473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 
1726882379.51477: handler run complete 11124 1726882379.51479: attempt loop complete, returning result 11124 1726882379.51481: _execute() done 11124 1726882379.51483: dumping result to json 11124 1726882379.51485: done dumping result, returning 11124 1726882379.51487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-8362-0f62-00000000003b] 11124 1726882379.51489: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000003b 11124 1726882379.51562: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000003b 11124 1726882379.51567: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 11124 1726882379.51919: no more pending results, returning what we have 11124 1726882379.51923: results queue empty 11124 1726882379.51923: checking for any_errors_fatal 11124 1726882379.51928: done checking for any_errors_fatal 11124 1726882379.51929: checking for max_fail_percentage 11124 1726882379.51931: done checking for max_fail_percentage 11124 1726882379.51932: checking to see if all hosts have failed and the running result is not ok 11124 1726882379.51933: done checking to see if all hosts have failed 11124 1726882379.51934: getting the remaining hosts for this loop 11124 1726882379.51935: done getting the remaining hosts for this loop 11124 1726882379.51938: getting the next task for host managed_node1 11124 1726882379.51947: done getting next task for host managed_node1 11124 1726882379.51952: ^ task is: TASK: meta (role_complete) 11124 1726882379.51955: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882379.51968: getting variables 11124 1726882379.51969: in VariableManager get_vars() 11124 1726882379.52010: Calling all_inventory to load vars for managed_node1 11124 1726882379.52013: Calling groups_inventory to load vars for managed_node1 11124 1726882379.52015: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.52025: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.52027: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.52030: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.53557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.55497: done with get_vars() 11124 1726882379.55523: done getting variables 11124 1726882379.55609: done queuing things up, now waiting for results queue to drain 11124 1726882379.55611: results queue empty 11124 1726882379.55612: checking for any_errors_fatal 11124 1726882379.55614: done checking for any_errors_fatal 11124 1726882379.55615: checking for max_fail_percentage 11124 1726882379.55616: done checking for max_fail_percentage 11124 1726882379.55617: checking to see if all hosts have failed and the running result is not ok 11124 1726882379.55617: done checking to see if all hosts have failed 11124 1726882379.55618: getting the remaining hosts for this loop 11124 1726882379.55619: done getting the remaining hosts for this loop 11124 1726882379.55622: getting the next task for host managed_node1 11124 1726882379.55632: done getting next task for host managed_node1 11124 1726882379.55634: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11124 1726882379.55636: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882379.55638: getting variables 11124 1726882379.55639: in VariableManager get_vars() 11124 1726882379.55656: Calling all_inventory to load vars for managed_node1 11124 1726882379.55658: Calling groups_inventory to load vars for managed_node1 11124 1726882379.55660: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.55667: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.55670: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.55673: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.57046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.60099: done with get_vars() 11124 1726882379.60127: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:32:59 -0400 (0:00:00.500) 0:00:19.844 ****** 11124 1726882379.60215: entering _queue_task() for managed_node1/include_tasks 11124 1726882379.60576: worker is 1 (out of 1 available) 11124 1726882379.60589: exiting _queue_task() for managed_node1/include_tasks 11124 1726882379.60602: done queuing things up, now waiting for results queue to drain 11124 1726882379.60604: waiting for pending results... 
11124 1726882379.60984: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11124 1726882379.61121: in run() - task 0e448fcc-3ce9-8362-0f62-00000000006e 11124 1726882379.61140: variable 'ansible_search_path' from source: unknown 11124 1726882379.61146: variable 'ansible_search_path' from source: unknown 11124 1726882379.61197: calling self._execute() 11124 1726882379.61295: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.61304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.61315: variable 'omit' from source: magic vars 11124 1726882379.62024: variable 'ansible_distribution_major_version' from source: facts 11124 1726882379.62065: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882379.62166: _execute() done 11124 1726882379.62182: dumping result to json 11124 1726882379.62190: done dumping result, returning 11124 1726882379.62200: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0e448fcc-3ce9-8362-0f62-00000000006e] 11124 1726882379.62211: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000006e 11124 1726882379.62343: no more pending results, returning what we have 11124 1726882379.62348: in VariableManager get_vars() 11124 1726882379.62403: Calling all_inventory to load vars for managed_node1 11124 1726882379.62407: Calling groups_inventory to load vars for managed_node1 11124 1726882379.62409: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.62423: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.62426: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.62429: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.63531: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000006e 11124 1726882379.63535: WORKER PROCESS EXITING 11124 
1726882379.64580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.67679: done with get_vars() 11124 1726882379.67706: variable 'ansible_search_path' from source: unknown 11124 1726882379.67708: variable 'ansible_search_path' from source: unknown 11124 1726882379.67747: we have included files to process 11124 1726882379.67749: generating all_blocks data 11124 1726882379.67751: done generating all_blocks data 11124 1726882379.67756: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882379.67757: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882379.67760: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11124 1726882379.67974: done processing included file 11124 1726882379.67977: iterating over new_blocks loaded from include file 11124 1726882379.67979: in VariableManager get_vars() 11124 1726882379.67999: done with get_vars() 11124 1726882379.68001: filtering new block on tags 11124 1726882379.68019: done filtering new block on tags 11124 1726882379.68021: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11124 1726882379.68027: extending task lists for all hosts with included blocks 11124 1726882379.68131: done extending task lists 11124 1726882379.68132: done processing included files 11124 1726882379.68133: results queue empty 11124 1726882379.68134: checking for any_errors_fatal 11124 1726882379.68135: done checking for any_errors_fatal 11124 1726882379.68136: checking for max_fail_percentage 11124 1726882379.68137: done checking for 
max_fail_percentage 11124 1726882379.68138: checking to see if all hosts have failed and the running result is not ok 11124 1726882379.68139: done checking to see if all hosts have failed 11124 1726882379.68140: getting the remaining hosts for this loop 11124 1726882379.68141: done getting the remaining hosts for this loop 11124 1726882379.68143: getting the next task for host managed_node1 11124 1726882379.68147: done getting next task for host managed_node1 11124 1726882379.68149: ^ task is: TASK: Get stat for interface {{ interface }} 11124 1726882379.68152: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882379.68154: getting variables 11124 1726882379.68156: in VariableManager get_vars() 11124 1726882379.68174: Calling all_inventory to load vars for managed_node1 11124 1726882379.68177: Calling groups_inventory to load vars for managed_node1 11124 1726882379.68179: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882379.68185: Calling all_plugins_play to load vars for managed_node1 11124 1726882379.68187: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882379.68190: Calling groups_plugins_play to load vars for managed_node1 11124 1726882379.69611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882379.71426: done with get_vars() 11124 1726882379.71450: done getting variables 11124 1726882379.71612: variable 'interface' from source: task vars 11124 1726882379.71616: variable 'controller_device' from source: play vars 11124 1726882379.71678: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:32:59 -0400 (0:00:00.114) 0:00:19.959 ****** 11124 1726882379.71711: entering _queue_task() for managed_node1/stat 11124 1726882379.72055: worker is 1 (out of 1 available) 11124 1726882379.72068: exiting _queue_task() for managed_node1/stat 11124 1726882379.72080: done queuing things up, now waiting for results queue to drain 11124 1726882379.72082: waiting for pending results... 
11124 1726882379.72361: running TaskExecutor() for managed_node1/TASK: Get stat for interface deprecated-bond 11124 1726882379.72482: in run() - task 0e448fcc-3ce9-8362-0f62-000000000242 11124 1726882379.72493: variable 'ansible_search_path' from source: unknown 11124 1726882379.72497: variable 'ansible_search_path' from source: unknown 11124 1726882379.72537: calling self._execute() 11124 1726882379.72618: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.72624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.72637: variable 'omit' from source: magic vars 11124 1726882379.73068: variable 'ansible_distribution_major_version' from source: facts 11124 1726882379.73082: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882379.73089: variable 'omit' from source: magic vars 11124 1726882379.73142: variable 'omit' from source: magic vars 11124 1726882379.73287: variable 'interface' from source: task vars 11124 1726882379.73290: variable 'controller_device' from source: play vars 11124 1726882379.73376: variable 'controller_device' from source: play vars 11124 1726882379.73380: variable 'omit' from source: magic vars 11124 1726882379.73436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882379.73471: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882379.73491: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882379.73513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882379.73524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882379.73555: variable 'inventory_hostname' from source: host vars 
for 'managed_node1' 11124 1726882379.73558: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.73561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.73668: Set connection var ansible_shell_executable to /bin/sh 11124 1726882379.73675: Set connection var ansible_shell_type to sh 11124 1726882379.73683: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882379.73689: Set connection var ansible_timeout to 10 11124 1726882379.73695: Set connection var ansible_pipelining to False 11124 1726882379.73698: Set connection var ansible_connection to ssh 11124 1726882379.73724: variable 'ansible_shell_executable' from source: unknown 11124 1726882379.73727: variable 'ansible_connection' from source: unknown 11124 1726882379.73729: variable 'ansible_module_compression' from source: unknown 11124 1726882379.73731: variable 'ansible_shell_type' from source: unknown 11124 1726882379.73734: variable 'ansible_shell_executable' from source: unknown 11124 1726882379.73736: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882379.73740: variable 'ansible_pipelining' from source: unknown 11124 1726882379.73743: variable 'ansible_timeout' from source: unknown 11124 1726882379.73747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882379.73952: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882379.73959: variable 'omit' from source: magic vars 11124 1726882379.73967: starting attempt loop 11124 1726882379.73970: running the handler 11124 1726882379.73984: _low_level_execute_command(): starting 11124 1726882379.73992: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 
1726882379.74699: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.74715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.74725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.74739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.74778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.74786: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.74796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.74809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.74822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.74828: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.74836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.74846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.74857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.74866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.74878: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.74887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.74961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.74977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.74983: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.75115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.76785: stdout chunk (state=3): >>>/root <<< 11124 1726882379.76909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.76984: stderr chunk (state=3): >>><<< 11124 1726882379.76988: stdout chunk (state=3): >>><<< 11124 1726882379.77011: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882379.77025: _low_level_execute_command(): starting 11124 1726882379.77031: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428 `" && echo 
ansible-tmp-1726882379.770114-12084-78423171844428="` echo /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428 `" ) && sleep 0' 11124 1726882379.77648: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.77658: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.77677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.77691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.77729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.77738: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.77747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.77761: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.77774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.77777: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.77785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.77794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.77806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.77813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.77820: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.77829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.77901: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 11124 1726882379.77914: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.77925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.78051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.79938: stdout chunk (state=3): >>>ansible-tmp-1726882379.770114-12084-78423171844428=/root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428 <<< 11124 1726882379.80048: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.80125: stderr chunk (state=3): >>><<< 11124 1726882379.80128: stdout chunk (state=3): >>><<< 11124 1726882379.80147: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882379.770114-12084-78423171844428=/root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 11124 1726882379.80194: variable 'ansible_module_compression' from source: unknown 11124 1726882379.80255: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11124 1726882379.80290: variable 'ansible_facts' from source: unknown 11124 1726882379.80384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/AnsiballZ_stat.py 11124 1726882379.80521: Sending initial data 11124 1726882379.80524: Sent initial data (151 bytes) 11124 1726882379.81447: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.81460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.81468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.81482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.81520: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.81527: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.81537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.81553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.81556: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.81567: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.81575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.81585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.81596: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.81603: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.81610: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.81619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.81692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.81706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.81716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.81836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.83622: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882379.83708: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882379.83800: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpoytvyhdp /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/AnsiballZ_stat.py <<< 11124 1726882379.83890: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882379.85233: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11124 1726882379.85320: stderr chunk (state=3): >>><<< 11124 1726882379.85323: stdout chunk (state=3): >>><<< 11124 1726882379.85347: done transferring module to remote 11124 1726882379.85359: _low_level_execute_command(): starting 11124 1726882379.85367: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/ /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/AnsiballZ_stat.py && sleep 0' 11124 1726882379.86028: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.86037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.86047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.86061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.86109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.86116: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.86126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.86140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.86147: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.86153: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.86162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.86175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.86187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.86196: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.86210: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.86220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.86292: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.86311: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.86328: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.86452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882379.88279: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882379.88322: stderr chunk (state=3): >>><<< 11124 1726882379.88325: stdout chunk (state=3): >>><<< 11124 1726882379.88342: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882379.88345: _low_level_execute_command(): starting 11124 1726882379.88353: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/AnsiballZ_stat.py && sleep 0' 11124 1726882379.88945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882379.88954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.88963: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.88978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.89014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882379.89021: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882379.89031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.89044: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882379.89053: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882379.89058: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882379.89063: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882379.89077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882379.89089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882379.89096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882379.89102: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882379.89111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882379.89183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882379.89196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882379.89206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882379.89330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.02574: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26141, "dev": 21, "nlink": 1, "atime": 1726882378.7168226, "mtime": 1726882378.7168226, "ctime": 1726882378.7168226, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11124 1726882380.03567: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.03585: stderr chunk (state=3): >>>Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882380.03680: stderr chunk (state=3): >>><<< 11124 1726882380.03695: stdout chunk (state=3): >>><<< 11124 1726882380.03876: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26141, "dev": 21, "nlink": 1, "atime": 1726882378.7168226, "mtime": 1726882378.7168226, "ctime": 1726882378.7168226, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882380.03886: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882380.03889: _low_level_execute_command(): starting 11124 1726882380.03892: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882379.770114-12084-78423171844428/ > /dev/null 2>&1 && sleep 0' 11124 1726882380.04483: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882380.04498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.04513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.04531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.04585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 <<< 11124 1726882380.04598: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882380.04612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.04630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882380.04642: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882380.04665: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882380.04680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.04694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.04709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.04722: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.04732: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882380.04746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.04829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.04854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.04876: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.05011: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.06923: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.06927: stdout chunk (state=3): >>><<< 11124 1726882380.06929: stderr chunk (state=3): >>><<< 11124 1726882380.07401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882380.07405: handler run complete 11124 1726882380.07407: attempt loop complete, returning result 11124 1726882380.07409: _execute() done 11124 1726882380.07411: dumping result to json 11124 1726882380.07413: done dumping result, returning 11124 1726882380.07415: done running TaskExecutor() for managed_node1/TASK: Get stat for interface deprecated-bond [0e448fcc-3ce9-8362-0f62-000000000242] 11124 1726882380.07417: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000242 11124 1726882380.07496: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000242 11124 1726882380.07500: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882378.7168226, "block_size": 4096, "blocks": 0, "ctime": 1726882378.7168226, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26141, "isblk": false, "ischr": false, "isdir": false, 
"isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1726882378.7168226, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11124 1726882380.07593: no more pending results, returning what we have 11124 1726882380.07596: results queue empty 11124 1726882380.07597: checking for any_errors_fatal 11124 1726882380.07599: done checking for any_errors_fatal 11124 1726882380.07599: checking for max_fail_percentage 11124 1726882380.07601: done checking for max_fail_percentage 11124 1726882380.07602: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.07603: done checking to see if all hosts have failed 11124 1726882380.07604: getting the remaining hosts for this loop 11124 1726882380.07605: done getting the remaining hosts for this loop 11124 1726882380.07610: getting the next task for host managed_node1 11124 1726882380.07618: done getting next task for host managed_node1 11124 1726882380.07620: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11124 1726882380.07623: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11124 1726882380.07627: getting variables 11124 1726882380.07629: in VariableManager get_vars() 11124 1726882380.07669: Calling all_inventory to load vars for managed_node1 11124 1726882380.07676: Calling groups_inventory to load vars for managed_node1 11124 1726882380.07679: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.07689: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.07692: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.07695: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.09195: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.11606: done with get_vars() 11124 1726882380.11636: done getting variables 11124 1726882380.11700: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882380.11834: variable 'interface' from source: task vars 11124 1726882380.11838: variable 'controller_device' from source: play vars 11124 1726882380.11898: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'deprecated-bond'] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:33:00 -0400 (0:00:00.402) 0:00:20.361 ****** 11124 1726882380.11936: entering _queue_task() for managed_node1/assert 11124 1726882380.12273: worker is 1 (out of 1 available) 11124 1726882380.12285: exiting _queue_task() for managed_node1/assert 11124 1726882380.12298: done queuing things up, now waiting for results queue to drain 11124 1726882380.12300: waiting 
for pending results... 11124 1726882380.12597: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'deprecated-bond' 11124 1726882380.12748: in run() - task 0e448fcc-3ce9-8362-0f62-00000000006f 11124 1726882380.12769: variable 'ansible_search_path' from source: unknown 11124 1726882380.12777: variable 'ansible_search_path' from source: unknown 11124 1726882380.12814: calling self._execute() 11124 1726882380.12912: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.12922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.12941: variable 'omit' from source: magic vars 11124 1726882380.13333: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.13351: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.13366: variable 'omit' from source: magic vars 11124 1726882380.13426: variable 'omit' from source: magic vars 11124 1726882380.13534: variable 'interface' from source: task vars 11124 1726882380.13544: variable 'controller_device' from source: play vars 11124 1726882380.13620: variable 'controller_device' from source: play vars 11124 1726882380.13644: variable 'omit' from source: magic vars 11124 1726882380.13709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882380.13751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882380.13778: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882380.13802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.13821: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.13862: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882380.13875: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.13883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.13996: Set connection var ansible_shell_executable to /bin/sh 11124 1726882380.14011: Set connection var ansible_shell_type to sh 11124 1726882380.14029: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882380.14042: Set connection var ansible_timeout to 10 11124 1726882380.14055: Set connection var ansible_pipelining to False 11124 1726882380.14065: Set connection var ansible_connection to ssh 11124 1726882380.14092: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.14100: variable 'ansible_connection' from source: unknown 11124 1726882380.14107: variable 'ansible_module_compression' from source: unknown 11124 1726882380.14113: variable 'ansible_shell_type' from source: unknown 11124 1726882380.14121: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.14128: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.14141: variable 'ansible_pipelining' from source: unknown 11124 1726882380.14149: variable 'ansible_timeout' from source: unknown 11124 1726882380.14162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.14312: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882380.14328: variable 'omit' from source: magic vars 11124 1726882380.14339: starting attempt loop 11124 1726882380.14346: running the handler 11124 1726882380.14495: variable 'interface_stat' from source: set_fact 11124 
1726882380.14523: Evaluated conditional (interface_stat.stat.exists): True 11124 1726882380.14533: handler run complete 11124 1726882380.14568: attempt loop complete, returning result 11124 1726882380.14582: _execute() done 11124 1726882380.14594: dumping result to json 11124 1726882380.14602: done dumping result, returning 11124 1726882380.14614: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'deprecated-bond' [0e448fcc-3ce9-8362-0f62-00000000006f] 11124 1726882380.14623: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000006f ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882380.14781: no more pending results, returning what we have 11124 1726882380.14784: results queue empty 11124 1726882380.14786: checking for any_errors_fatal 11124 1726882380.14795: done checking for any_errors_fatal 11124 1726882380.14796: checking for max_fail_percentage 11124 1726882380.14798: done checking for max_fail_percentage 11124 1726882380.14799: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.14800: done checking to see if all hosts have failed 11124 1726882380.14801: getting the remaining hosts for this loop 11124 1726882380.14802: done getting the remaining hosts for this loop 11124 1726882380.14806: getting the next task for host managed_node1 11124 1726882380.14814: done getting next task for host managed_node1 11124 1726882380.14817: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11124 1726882380.14820: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882380.14823: getting variables 11124 1726882380.14825: in VariableManager get_vars() 11124 1726882380.14870: Calling all_inventory to load vars for managed_node1 11124 1726882380.14873: Calling groups_inventory to load vars for managed_node1 11124 1726882380.14876: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.14887: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.14890: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.14893: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.15905: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000006f 11124 1726882380.15909: WORKER PROCESS EXITING 11124 1726882380.16692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.22424: done with get_vars() 11124 1726882380.22461: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67 Friday 20 September 2024 21:33:00 -0400 (0:00:00.106) 0:00:20.467 ****** 11124 1726882380.22546: entering _queue_task() for managed_node1/include_tasks 11124 1726882380.22892: worker is 1 (out of 1 available) 11124 1726882380.22904: exiting _queue_task() for managed_node1/include_tasks 11124 1726882380.22916: done queuing things up, now waiting for results queue to drain 11124 1726882380.22918: waiting for pending results... 
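
For readers reconstructing the test flow from this trace: the two tasks just completed correspond to a stat-then-assert pattern like the one below. This is a hedged sketch assembled from the `module_args` and the evaluated conditional visible in the log, not the verbatim contents of the collection's `assert_device_present.yml`; the `register`/variable wiring in the real file may differ (the log shows `interface_stat` coming from `set_fact`).

```yaml
# Reconstruction of the device-presence check traced above.
# module_args taken from the log: get_attributes/get_checksum/get_mime false,
# follow false, path under /sys/class/net.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"  # symlink exists iff the kernel has the device
    follow: false
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists  # the conditional the log evaluates to True
```

In this run, `interface` resolves through `controller_device` to `deprecated-bond`, and the stat result confirms the expected `/sys/class/net/deprecated-bond` symlink into `/sys/devices/virtual/net/deprecated-bond`.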
11124 1726882380.23190: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 11124 1726882380.23300: in run() - task 0e448fcc-3ce9-8362-0f62-000000000070 11124 1726882380.23322: variable 'ansible_search_path' from source: unknown 11124 1726882380.23382: variable 'controller_profile' from source: play vars 11124 1726882380.23589: variable 'controller_profile' from source: play vars 11124 1726882380.23606: variable 'port1_profile' from source: play vars 11124 1726882380.23678: variable 'port1_profile' from source: play vars 11124 1726882380.23691: variable 'port2_profile' from source: play vars 11124 1726882380.23753: variable 'port2_profile' from source: play vars 11124 1726882380.23771: variable 'omit' from source: magic vars 11124 1726882380.23907: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.23922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.23937: variable 'omit' from source: magic vars 11124 1726882380.24162: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.24236: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.24269: variable 'item' from source: unknown 11124 1726882380.24390: variable 'item' from source: unknown 11124 1726882380.24731: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.24742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.24754: variable 'omit' from source: magic vars 11124 1726882380.24919: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.24929: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.24958: variable 'item' from source: unknown 11124 1726882380.25028: variable 'item' from source: unknown 11124 1726882380.25163: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 
1726882380.25180: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.25194: variable 'omit' from source: magic vars 11124 1726882380.25361: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.25374: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.25401: variable 'item' from source: unknown 11124 1726882380.25474: variable 'item' from source: unknown 11124 1726882380.25561: dumping result to json 11124 1726882380.25574: done dumping result, returning 11124 1726882380.25586: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0e448fcc-3ce9-8362-0f62-000000000070] 11124 1726882380.25597: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000070 11124 1726882380.25659: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000070 11124 1726882380.25672: WORKER PROCESS EXITING 11124 1726882380.25714: no more pending results, returning what we have 11124 1726882380.25718: in VariableManager get_vars() 11124 1726882380.25763: Calling all_inventory to load vars for managed_node1 11124 1726882380.25767: Calling groups_inventory to load vars for managed_node1 11124 1726882380.25769: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.25783: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.25786: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.25789: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.26633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.27598: done with get_vars() 11124 1726882380.27613: variable 'ansible_search_path' from source: unknown 11124 1726882380.27637: variable 'ansible_search_path' from source: unknown 11124 1726882380.27649: variable 'ansible_search_path' from source: unknown 11124 
1726882380.27657: we have included files to process 11124 1726882380.27658: generating all_blocks data 11124 1726882380.27659: done generating all_blocks data 11124 1726882380.27665: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.27666: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.27668: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.27855: in VariableManager get_vars() 11124 1726882380.27881: done with get_vars() 11124 1726882380.28136: done processing included file 11124 1726882380.28138: iterating over new_blocks loaded from include file 11124 1726882380.28139: in VariableManager get_vars() 11124 1726882380.28157: done with get_vars() 11124 1726882380.28159: filtering new block on tags 11124 1726882380.28182: done filtering new block on tags 11124 1726882380.28185: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0) 11124 1726882380.28190: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.28191: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.28193: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.28289: in VariableManager get_vars() 11124 1726882380.28310: done with get_vars() 11124 1726882380.28523: done 
processing included file 11124 1726882380.28525: iterating over new_blocks loaded from include file 11124 1726882380.28526: in VariableManager get_vars() 11124 1726882380.28543: done with get_vars() 11124 1726882380.28545: filtering new block on tags 11124 1726882380.28566: done filtering new block on tags 11124 1726882380.28569: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.0) 11124 1726882380.28573: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.28574: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.28577: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11124 1726882380.28737: in VariableManager get_vars() 11124 1726882380.28758: done with get_vars() 11124 1726882380.28983: done processing included file 11124 1726882380.28985: iterating over new_blocks loaded from include file 11124 1726882380.28986: in VariableManager get_vars() 11124 1726882380.29002: done with get_vars() 11124 1726882380.29003: filtering new block on tags 11124 1726882380.29018: done filtering new block on tags 11124 1726882380.29020: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.1) 11124 1726882380.29022: extending task lists for all hosts with included blocks 11124 1726882380.30675: done extending task lists 11124 1726882380.30681: done processing included files 11124 1726882380.30682: results queue empty 11124 
1726882380.30682: checking for any_errors_fatal 11124 1726882380.30686: done checking for any_errors_fatal 11124 1726882380.30686: checking for max_fail_percentage 11124 1726882380.30687: done checking for max_fail_percentage 11124 1726882380.30687: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.30688: done checking to see if all hosts have failed 11124 1726882380.30688: getting the remaining hosts for this loop 11124 1726882380.30689: done getting the remaining hosts for this loop 11124 1726882380.30691: getting the next task for host managed_node1 11124 1726882380.30694: done getting next task for host managed_node1 11124 1726882380.30695: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11124 1726882380.30697: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882380.30698: getting variables 11124 1726882380.30699: in VariableManager get_vars() 11124 1726882380.30710: Calling all_inventory to load vars for managed_node1 11124 1726882380.30712: Calling groups_inventory to load vars for managed_node1 11124 1726882380.30713: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.30718: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.30720: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.30721: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.31634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.33321: done with get_vars() 11124 1726882380.33346: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:00 -0400 (0:00:00.108) 0:00:20.576 ****** 11124 1726882380.33429: entering _queue_task() for managed_node1/include_tasks 11124 1726882380.33772: worker is 1 (out of 1 available) 11124 1726882380.33785: exiting _queue_task() for managed_node1/include_tasks 11124 1726882380.33798: done queuing things up, now waiting for results queue to drain 11124 1726882380.33799: waiting for pending results... 
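Editor's note: the records above show `assert_profile_present.yml` being included once per item (`bond0`, `bond0.0`, `bond0.1`), with `profile` later resolved "from source: include params". The actual task file is not shown in this log; a plausible sketch of what such a looped include looks like, with the variable wiring as an assumption:

```yaml
# Hypothetical reconstruction of the looped include that produced the
# "(item=bond0)" / "(item=bond0.0)" / "(item=bond0.1)" records above.
# The real task lives in the fedora.linux_system_roles test playbooks
# and may differ in detail.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - bond0
    - bond0.0
    - bond0.1
```

Each iteration extends the host's task list with a fresh copy of the included block, which is why the strategy logs "extending task lists for all hosts with included blocks" once per item.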
11124 1726882380.34077: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11124 1726882380.34186: in run() - task 0e448fcc-3ce9-8362-0f62-000000000260 11124 1726882380.34207: variable 'ansible_search_path' from source: unknown 11124 1726882380.34214: variable 'ansible_search_path' from source: unknown 11124 1726882380.34261: calling self._execute() 11124 1726882380.34359: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.34373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.34387: variable 'omit' from source: magic vars 11124 1726882380.34802: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.34821: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.34832: _execute() done 11124 1726882380.34840: dumping result to json 11124 1726882380.34848: done dumping result, returning 11124 1726882380.34858: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-8362-0f62-000000000260] 11124 1726882380.34871: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000260 11124 1726882380.34987: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000260 11124 1726882380.34995: WORKER PROCESS EXITING 11124 1726882380.35032: no more pending results, returning what we have 11124 1726882380.35038: in VariableManager get_vars() 11124 1726882380.35094: Calling all_inventory to load vars for managed_node1 11124 1726882380.35098: Calling groups_inventory to load vars for managed_node1 11124 1726882380.35100: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.35117: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.35120: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.35124: Calling groups_plugins_play to load vars for managed_node1 11124 
1726882380.36866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.38641: done with get_vars() 11124 1726882380.38671: variable 'ansible_search_path' from source: unknown 11124 1726882380.38672: variable 'ansible_search_path' from source: unknown 11124 1726882380.38713: we have included files to process 11124 1726882380.38715: generating all_blocks data 11124 1726882380.38717: done generating all_blocks data 11124 1726882380.38718: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882380.38719: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882380.38721: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882380.39807: done processing included file 11124 1726882380.39815: iterating over new_blocks loaded from include file 11124 1726882380.39817: in VariableManager get_vars() 11124 1726882380.39851: done with get_vars() 11124 1726882380.39854: filtering new block on tags 11124 1726882380.39882: done filtering new block on tags 11124 1726882380.39885: in VariableManager get_vars() 11124 1726882380.39904: done with get_vars() 11124 1726882380.39906: filtering new block on tags 11124 1726882380.39928: done filtering new block on tags 11124 1726882380.39930: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11124 1726882380.39935: extending task lists for all hosts with included blocks 11124 1726882380.40391: done extending task lists 11124 1726882380.40393: done processing included files 11124 1726882380.40394: results queue empty 11124 
1726882380.40395: checking for any_errors_fatal 11124 1726882380.40398: done checking for any_errors_fatal 11124 1726882380.40399: checking for max_fail_percentage 11124 1726882380.40400: done checking for max_fail_percentage 11124 1726882380.40401: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.40402: done checking to see if all hosts have failed 11124 1726882380.40403: getting the remaining hosts for this loop 11124 1726882380.40404: done getting the remaining hosts for this loop 11124 1726882380.40406: getting the next task for host managed_node1 11124 1726882380.40410: done getting next task for host managed_node1 11124 1726882380.40413: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11124 1726882380.40417: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882380.40419: getting variables 11124 1726882380.40420: in VariableManager get_vars() 11124 1726882380.40503: Calling all_inventory to load vars for managed_node1 11124 1726882380.40506: Calling groups_inventory to load vars for managed_node1 11124 1726882380.40508: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.40515: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.40517: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.40521: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.42162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.44289: done with get_vars() 11124 1726882380.44313: done getting variables 11124 1726882380.44366: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:00 -0400 (0:00:00.109) 0:00:20.686 ****** 11124 1726882380.44397: entering _queue_task() for managed_node1/set_fact 11124 1726882380.44744: worker is 1 (out of 1 available) 11124 1726882380.44760: exiting _queue_task() for managed_node1/set_fact 11124 1726882380.44778: done queuing things up, now waiting for results queue to drain 11124 1726882380.44780: waiting for pending results... 
11124 1726882380.45247: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11124 1726882380.45373: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003b3 11124 1726882380.45393: variable 'ansible_search_path' from source: unknown 11124 1726882380.45401: variable 'ansible_search_path' from source: unknown 11124 1726882380.45445: calling self._execute() 11124 1726882380.45535: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.45552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.45570: variable 'omit' from source: magic vars 11124 1726882380.45944: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.45966: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.45982: variable 'omit' from source: magic vars 11124 1726882380.46032: variable 'omit' from source: magic vars 11124 1726882380.46074: variable 'omit' from source: magic vars 11124 1726882380.46123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882380.46163: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882380.46192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882380.46215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.46230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.46268: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882380.46278: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.46286: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11124 1726882380.46393: Set connection var ansible_shell_executable to /bin/sh 11124 1726882380.46407: Set connection var ansible_shell_type to sh 11124 1726882380.46424: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882380.46433: Set connection var ansible_timeout to 10 11124 1726882380.46443: Set connection var ansible_pipelining to False 11124 1726882380.46452: Set connection var ansible_connection to ssh 11124 1726882380.46478: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.46486: variable 'ansible_connection' from source: unknown 11124 1726882380.46493: variable 'ansible_module_compression' from source: unknown 11124 1726882380.46501: variable 'ansible_shell_type' from source: unknown 11124 1726882380.46508: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.46514: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.46525: variable 'ansible_pipelining' from source: unknown 11124 1726882380.46531: variable 'ansible_timeout' from source: unknown 11124 1726882380.46538: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.46699: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882380.46716: variable 'omit' from source: magic vars 11124 1726882380.46726: starting attempt loop 11124 1726882380.46734: running the handler 11124 1726882380.46758: handler run complete 11124 1726882380.46776: attempt loop complete, returning result 11124 1726882380.46783: _execute() done 11124 1726882380.46790: dumping result to json 11124 1726882380.46797: done dumping result, returning 11124 1726882380.46809: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-8362-0f62-0000000003b3] 11124 1726882380.46818: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b3 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11124 1726882380.46971: no more pending results, returning what we have 11124 1726882380.46974: results queue empty 11124 1726882380.46975: checking for any_errors_fatal 11124 1726882380.46977: done checking for any_errors_fatal 11124 1726882380.46978: checking for max_fail_percentage 11124 1726882380.46980: done checking for max_fail_percentage 11124 1726882380.46981: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.46982: done checking to see if all hosts have failed 11124 1726882380.46982: getting the remaining hosts for this loop 11124 1726882380.46984: done getting the remaining hosts for this loop 11124 1726882380.46987: getting the next task for host managed_node1 11124 1726882380.46994: done getting next task for host managed_node1 11124 1726882380.46997: ^ task is: TASK: Stat profile file 11124 1726882380.47002: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882380.47006: getting variables 11124 1726882380.47008: in VariableManager get_vars() 11124 1726882380.47054: Calling all_inventory to load vars for managed_node1 11124 1726882380.47058: Calling groups_inventory to load vars for managed_node1 11124 1726882380.47061: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.47076: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.47080: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.47083: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.48084: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b3 11124 1726882380.48088: WORKER PROCESS EXITING 11124 1726882380.48829: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.50577: done with get_vars() 11124 1726882380.50600: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:00 -0400 (0:00:00.062) 0:00:20.749 ****** 11124 1726882380.50696: entering _queue_task() for managed_node1/stat 11124 1726882380.51005: worker is 1 (out of 1 available) 11124 1726882380.51017: exiting _queue_task() for managed_node1/stat 11124 1726882380.51028: done queuing things up, now waiting for results queue to drain 11124 1726882380.51030: waiting for pending results... 
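Editor's note: the `ok: [managed_node1]` result above prints the three `ansible_facts` being initialized. Since `get_profile_stat.yml` itself is not reproduced in this log, the following is only a reconstruction of the task at line 3 of that file, inferred from the printed facts:

```yaml
# Hypothetical reconstruction of the task at
# tasks/get_profile_stat.yml:3, based solely on the ansible_facts
# shown in the result record above.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

`set_fact` runs entirely on the controller (no remote command is executed), which is why this task completes with no SSH traffic, unlike the `stat` task that follows.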
11124 1726882380.51312: running TaskExecutor() for managed_node1/TASK: Stat profile file 11124 1726882380.51428: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003b4 11124 1726882380.51445: variable 'ansible_search_path' from source: unknown 11124 1726882380.51456: variable 'ansible_search_path' from source: unknown 11124 1726882380.51500: calling self._execute() 11124 1726882380.51598: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.51608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.51620: variable 'omit' from source: magic vars 11124 1726882380.51991: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.52008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.52023: variable 'omit' from source: magic vars 11124 1726882380.52077: variable 'omit' from source: magic vars 11124 1726882380.52181: variable 'profile' from source: include params 11124 1726882380.52190: variable 'item' from source: include params 11124 1726882380.52267: variable 'item' from source: include params 11124 1726882380.52291: variable 'omit' from source: magic vars 11124 1726882380.52335: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882380.52381: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882380.52405: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882380.52428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.52443: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.52485: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882380.52495: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.52502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.52611: Set connection var ansible_shell_executable to /bin/sh 11124 1726882380.52624: Set connection var ansible_shell_type to sh 11124 1726882380.52636: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882380.52645: Set connection var ansible_timeout to 10 11124 1726882380.52660: Set connection var ansible_pipelining to False 11124 1726882380.52675: Set connection var ansible_connection to ssh 11124 1726882380.52701: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.52710: variable 'ansible_connection' from source: unknown 11124 1726882380.52717: variable 'ansible_module_compression' from source: unknown 11124 1726882380.52723: variable 'ansible_shell_type' from source: unknown 11124 1726882380.52729: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.52735: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.52742: variable 'ansible_pipelining' from source: unknown 11124 1726882380.52751: variable 'ansible_timeout' from source: unknown 11124 1726882380.52760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.52970: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882380.52985: variable 'omit' from source: magic vars 11124 1726882380.53001: starting attempt loop 11124 1726882380.53008: running the handler 11124 1726882380.53026: _low_level_execute_command(): starting 11124 1726882380.53036: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882380.53817: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882380.53832: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.53847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.53876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.53921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.53934: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882380.53952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.53974: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882380.53990: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882380.54002: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882380.54015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.54030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.54046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.54066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.54079: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882380.54098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.54180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.54207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.54225: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11124 1726882380.54356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.56034: stdout chunk (state=3): >>>/root <<< 11124 1726882380.56137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.56229: stderr chunk (state=3): >>><<< 11124 1726882380.56241: stdout chunk (state=3): >>><<< 11124 1726882380.56373: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882380.56377: _low_level_execute_command(): starting 11124 1726882380.56380: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362 `" && echo ansible-tmp-1726882380.562764-12111-78981801724362="` echo 
/root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362 `" ) && sleep 0' 11124 1726882380.56995: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882380.57009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.57032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.57054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.57099: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.57111: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882380.57125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.57155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882380.57171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882380.57183: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882380.57195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.57209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.57225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.57237: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.57259: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882380.57276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.57356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.57385: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.57402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.57525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.59414: stdout chunk (state=3): >>>ansible-tmp-1726882380.562764-12111-78981801724362=/root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362 <<< 11124 1726882380.59527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.59621: stderr chunk (state=3): >>><<< 11124 1726882380.59630: stdout chunk (state=3): >>><<< 11124 1726882380.59674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882380.562764-12111-78981801724362=/root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
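Editor's note: the two `_low_level_execute_command()` calls above amount to a short remote shell sequence: probe `$HOME`, then create a uniquely named working directory under `~/.ansible/tmp`. A minimal standalone sketch follows; the name layout (epoch timestamp, a PID-like number, a random suffix) is inferred from the logged example `ansible-tmp-1726882380.562764-12111-78981801724362` and may differ across ansible-core versions.

```shell
# Recreate locally what Ansible ran on the remote in the log above:
# restrictive umask, then a uniquely named ansible-tmp-* directory.
umask 77
base="$HOME/.ansible/tmp"
# Random suffix from /dev/urandom (portable stand-in for whatever
# entropy source ansible-core actually uses -- an assumption).
rand=$(od -An -N4 -tu4 /dev/urandom | tr -d ' ')
tmpdir="$base/ansible-tmp-$(date +%s)-$$-$rand"
mkdir -p "$base" && mkdir "$tmpdir" && echo "$tmpdir"
```

The module payload (`AnsiballZ_stat.py`) is then transferred into this directory over the already-established SSH multiplexing channel (`auto-mux: Trying existing master` in the stderr chunks) and made executable with `chmod u+x` before execution.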
11124 1726882380.59974: variable 'ansible_module_compression' from source: unknown 11124 1726882380.59977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11124 1726882380.59979: variable 'ansible_facts' from source: unknown 11124 1726882380.59981: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/AnsiballZ_stat.py 11124 1726882380.60043: Sending initial data 11124 1726882380.60046: Sent initial data (151 bytes) 11124 1726882380.61045: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882380.61070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.61086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.61103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.61145: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.61160: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882380.61184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.61201: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882380.61212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882380.61224: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882380.61236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.61251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.61271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 
1726882380.61288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.61302: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882380.61315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.61396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.61425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.61440: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.61562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.63497: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882380.63584: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882380.63673: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmppvtxhh89 /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/AnsiballZ_stat.py <<< 11124 1726882380.63765: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882380.64876: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.65040: 
stderr chunk (state=3): >>><<< 11124 1726882380.65044: stdout chunk (state=3): >>><<< 11124 1726882380.65047: done transferring module to remote 11124 1726882380.65050: _low_level_execute_command(): starting 11124 1726882380.65061: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/ /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/AnsiballZ_stat.py && sleep 0' 11124 1726882380.65570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882380.65584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.65598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.65617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.65659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.65675: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882380.65689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.65707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882380.65719: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882380.65729: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882380.65741: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.65754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.65772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.65785: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.65796: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882380.65810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.65885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.65906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.65927: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.66062: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.67871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.67924: stderr chunk (state=3): >>><<< 11124 1726882380.67928: stdout chunk (state=3): >>><<< 11124 1726882380.67942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882380.67944: _low_level_execute_command(): starting 11124 1726882380.67952: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/AnsiballZ_stat.py && sleep 0' 11124 1726882380.68398: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.68404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.68437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.68441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.68462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882380.68469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.68520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.68524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.68529: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.68641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 
1726882380.81813: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11124 1726882380.82802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882380.82906: stderr chunk (state=3): >>><<< 11124 1726882380.82940: stdout chunk (state=3): >>><<< 11124 1726882380.83000: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.44.90 closed. 11124 1726882380.83049: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882380.83057: _low_level_execute_command(): starting 11124 1726882380.83060: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882380.562764-12111-78981801724362/ > /dev/null 2>&1 && sleep 0' 11124 1726882380.83788: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.83794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.83835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.83857: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.83943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.83968: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.83984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.84103: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.85918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.85968: stderr chunk (state=3): >>><<< 11124 1726882380.85971: stdout chunk (state=3): >>><<< 11124 1726882380.85985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882380.85991: handler run complete 11124 1726882380.86006: attempt loop complete, returning result 11124 1726882380.86010: _execute() done 11124 1726882380.86013: dumping result to json 11124 1726882380.86015: done dumping result, returning 11124 1726882380.86023: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-8362-0f62-0000000003b4] 11124 1726882380.86028: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b4 11124 1726882380.86121: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b4 11124 1726882380.86125: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11124 1726882380.86185: no more pending results, returning what we have 11124 1726882380.86189: results queue empty 11124 1726882380.86190: checking for any_errors_fatal 11124 1726882380.86196: done checking for any_errors_fatal 11124 1726882380.86197: checking for max_fail_percentage 11124 1726882380.86198: done checking for max_fail_percentage 11124 1726882380.86199: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.86200: done checking to see if all hosts have failed 11124 1726882380.86201: getting the remaining hosts for this loop 11124 1726882380.86202: done getting the remaining hosts for this loop 11124 1726882380.86206: getting the next task for host managed_node1 11124 1726882380.86212: done getting next task for host managed_node1 11124 1726882380.86214: ^ task is: TASK: Set NM profile exist flag based on the profile files 11124 1726882380.86218: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882380.86222: getting variables 11124 1726882380.86223: in VariableManager get_vars() 11124 1726882380.86267: Calling all_inventory to load vars for managed_node1 11124 1726882380.86270: Calling groups_inventory to load vars for managed_node1 11124 1726882380.86273: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.86283: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.86286: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.86288: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.87255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.88512: done with get_vars() 11124 1726882380.88533: done getting variables 11124 1726882380.88580: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:00 -0400 (0:00:00.379) 0:00:21.128 ****** 11124 1726882380.88602: entering _queue_task() for managed_node1/set_fact 11124 1726882380.88929: worker is 1 (out of 1 available) 11124 1726882380.88946: exiting _queue_task() for managed_node1/set_fact 11124 1726882380.88963: done queuing things up, now waiting for results queue to drain 11124 1726882380.88966: waiting for pending results... 11124 1726882380.89254: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11124 1726882380.89376: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003b5 11124 1726882380.89391: variable 'ansible_search_path' from source: unknown 11124 1726882380.89395: variable 'ansible_search_path' from source: unknown 11124 1726882380.89419: calling self._execute() 11124 1726882380.89492: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.89496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.89522: variable 'omit' from source: magic vars 11124 1726882380.89876: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.89886: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.90008: variable 'profile_stat' from source: set_fact 11124 1726882380.90024: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882380.90028: when evaluation is False, skipping this task 11124 1726882380.90030: _execute() done 11124 1726882380.90043: dumping result to json 11124 1726882380.90046: done dumping result, returning 11124 1726882380.90055: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-8362-0f62-0000000003b5] 11124 1726882380.90057: sending task result for task 
0e448fcc-3ce9-8362-0f62-0000000003b5 11124 1726882380.90150: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b5 11124 1726882380.90153: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882380.90310: no more pending results, returning what we have 11124 1726882380.90313: results queue empty 11124 1726882380.90314: checking for any_errors_fatal 11124 1726882380.90320: done checking for any_errors_fatal 11124 1726882380.90320: checking for max_fail_percentage 11124 1726882380.90322: done checking for max_fail_percentage 11124 1726882380.90322: checking to see if all hosts have failed and the running result is not ok 11124 1726882380.90323: done checking to see if all hosts have failed 11124 1726882380.90324: getting the remaining hosts for this loop 11124 1726882380.90325: done getting the remaining hosts for this loop 11124 1726882380.90328: getting the next task for host managed_node1 11124 1726882380.90333: done getting next task for host managed_node1 11124 1726882380.90335: ^ task is: TASK: Get NM profile info 11124 1726882380.90339: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11124 1726882380.90341: getting variables 11124 1726882380.90342: in VariableManager get_vars() 11124 1726882380.90384: Calling all_inventory to load vars for managed_node1 11124 1726882380.90386: Calling groups_inventory to load vars for managed_node1 11124 1726882380.90388: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882380.90395: Calling all_plugins_play to load vars for managed_node1 11124 1726882380.90397: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882380.90399: Calling groups_plugins_play to load vars for managed_node1 11124 1726882380.91219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882380.92358: done with get_vars() 11124 1726882380.92375: done getting variables 11124 1726882380.92416: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:00 -0400 (0:00:00.038) 0:00:21.166 ****** 11124 1726882380.92438: entering _queue_task() for managed_node1/shell 11124 1726882380.92658: worker is 1 (out of 1 available) 11124 1726882380.92679: exiting _queue_task() for managed_node1/shell 11124 1726882380.92691: done queuing things up, now waiting for results queue to drain 11124 1726882380.92692: waiting for pending results... 
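The "Set NM profile exist flag" task above was skipped because the registered `profile_stat` result had `stat.exists == false`, so its `when: profile_stat.stat.exists` guard evaluated to False. A minimal sketch of that skip decision (not Ansible's real Jinja-based conditional evaluator) looks like:

```python
# Simplified model of the skip logged above: the stat task registered a
# result with stat.exists == False, so the conditional on the follow-up
# set_fact task fails and Ansible emits a "skipping" result instead of
# running the handler. This mimics the logged output; it is not Ansible code.
profile_stat = {"changed": False, "stat": {"exists": False}}  # from the stat task in the log

def evaluate_when(result: dict) -> dict:
    if not result["stat"]["exists"]:
        return {
            "changed": False,
            "false_condition": "profile_stat.stat.exists",
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

outcome = evaluate_when(profile_stat)
print(outcome["skip_reason"])
```

This matches the `skipping: [managed_node1]` payload in the log, which reports the same `false_condition` and `skip_reason` fields.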
11124 1726882380.92928: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11124 1726882380.93024: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003b6 11124 1726882380.93039: variable 'ansible_search_path' from source: unknown 11124 1726882380.93042: variable 'ansible_search_path' from source: unknown 11124 1726882380.93089: calling self._execute() 11124 1726882380.93168: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.93174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.93181: variable 'omit' from source: magic vars 11124 1726882380.93465: variable 'ansible_distribution_major_version' from source: facts 11124 1726882380.93475: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882380.93481: variable 'omit' from source: magic vars 11124 1726882380.93545: variable 'omit' from source: magic vars 11124 1726882380.93629: variable 'profile' from source: include params 11124 1726882380.93633: variable 'item' from source: include params 11124 1726882380.93694: variable 'item' from source: include params 11124 1726882380.93707: variable 'omit' from source: magic vars 11124 1726882380.93769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882380.93808: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882380.93837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882380.93853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.93877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882380.93899: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882380.93902: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.93904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.93990: Set connection var ansible_shell_executable to /bin/sh 11124 1726882380.93996: Set connection var ansible_shell_type to sh 11124 1726882380.94004: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882380.94016: Set connection var ansible_timeout to 10 11124 1726882380.94026: Set connection var ansible_pipelining to False 11124 1726882380.94033: Set connection var ansible_connection to ssh 11124 1726882380.94050: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.94055: variable 'ansible_connection' from source: unknown 11124 1726882380.94058: variable 'ansible_module_compression' from source: unknown 11124 1726882380.94072: variable 'ansible_shell_type' from source: unknown 11124 1726882380.94075: variable 'ansible_shell_executable' from source: unknown 11124 1726882380.94077: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882380.94080: variable 'ansible_pipelining' from source: unknown 11124 1726882380.94088: variable 'ansible_timeout' from source: unknown 11124 1726882380.94091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882380.94217: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882380.94227: variable 'omit' from source: magic vars 11124 1726882380.94230: starting attempt loop 11124 1726882380.94233: running the handler 11124 1726882380.94249: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882380.94288: _low_level_execute_command(): starting 11124 1726882380.94292: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882380.95053: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.95058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.95097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.95101: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882380.95108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.95167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.95170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.95211: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.95321: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.96968: stdout chunk (state=3): >>>/root <<< 11124 
1726882380.97083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882380.97156: stderr chunk (state=3): >>><<< 11124 1726882380.97162: stdout chunk (state=3): >>><<< 11124 1726882380.97188: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882380.97201: _low_level_execute_command(): starting 11124 1726882380.97208: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416 `" && echo ansible-tmp-1726882380.9718812-12127-272496228850416="` echo /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416 `" ) && sleep 0' 11124 1726882380.97705: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11124 1726882380.97712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882380.97744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882380.97747: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882380.97760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.97784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882380.97788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882380.97835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882380.97841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882380.97853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882380.97969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882380.99837: stdout chunk (state=3): >>>ansible-tmp-1726882380.9718812-12127-272496228850416=/root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416 <<< 11124 1726882380.99959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882381.00004: stderr chunk (state=3): >>><<< 11124 1726882381.00009: stdout chunk (state=3): >>><<< 11124 1726882381.00066: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882380.9718812-12127-272496228850416=/root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882381.00070: variable 'ansible_module_compression' from source: unknown 11124 1726882381.00125: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882381.00179: variable 'ansible_facts' from source: unknown 11124 1726882381.00252: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/AnsiballZ_command.py 11124 1726882381.00456: Sending initial data 11124 1726882381.00459: Sent initial data (156 bytes) 11124 1726882381.01311: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11124 1726882381.01315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882381.01342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.01373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.01417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882381.01434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882381.01556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882381.03279: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882381.03373: 
stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882381.03470: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpxi1uj_o1 /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/AnsiballZ_command.py <<< 11124 1726882381.03562: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882381.04738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882381.04892: stderr chunk (state=3): >>><<< 11124 1726882381.04895: stdout chunk (state=3): >>><<< 11124 1726882381.04918: done transferring module to remote 11124 1726882381.04929: _low_level_execute_command(): starting 11124 1726882381.04935: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/ /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/AnsiballZ_command.py && sleep 0' 11124 1726882381.05568: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882381.05571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882381.05606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.05609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882381.05615: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882381.05622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.05672: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882381.05682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882381.05799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882381.07574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882381.07619: stderr chunk (state=3): >>><<< 11124 1726882381.07622: stdout chunk (state=3): >>><<< 11124 1726882381.07635: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882381.07638: _low_level_execute_command(): starting 11124 1726882381.07642: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/AnsiballZ_command.py && sleep 0' 11124 1726882381.08097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882381.08103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882381.08141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.08144: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882381.08146: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882381.08148: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.08197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882381.08201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882381.08309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882381.33794: stdout chunk (state=3): >>> 
{"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:01.212131", "end": "2024-09-20 21:33:01.336541", "delta": "0:00:00.124410", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882381.35072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882381.35128: stderr chunk (state=3): >>><<< 11124 1726882381.35131: stdout chunk (state=3): >>><<< 11124 1726882381.35151: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:33:01.212131", "end": "2024-09-20 21:33:01.336541", "delta": "0:00:00.124410", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882381.35185: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882381.35190: _low_level_execute_command(): starting 11124 1726882381.35195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882380.9718812-12127-272496228850416/ > /dev/null 2>&1 && sleep 0' 11124 1726882381.35843: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882381.35846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882381.35888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.35891: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882381.35894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882381.35940: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882381.35947: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882381.35959: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882381.36069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882381.38111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882381.38182: stderr chunk (state=3): >>><<< 11124 1726882381.38188: stdout chunk (state=3): >>><<< 11124 1726882381.38209: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882381.38216: handler run complete 11124 1726882381.38240: Evaluated conditional (False): False 11124 1726882381.38251: attempt loop complete, returning result 11124 1726882381.38254: _execute() done 11124 1726882381.38260: dumping result to json 11124 1726882381.38272: done dumping result, returning 11124 1726882381.38280: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-8362-0f62-0000000003b6] 11124 1726882381.38285: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b6 ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.124410", "end": "2024-09-20 21:33:01.336541", "rc": 0, "start": "2024-09-20 21:33:01.212131" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11124 1726882381.38467: no more pending results, returning what we have 11124 1726882381.38471: results queue empty 11124 1726882381.38472: checking for any_errors_fatal 11124 1726882381.38479: done checking for any_errors_fatal 11124 1726882381.38480: checking for max_fail_percentage 11124 1726882381.38482: done checking for max_fail_percentage 11124 1726882381.38483: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.38484: done checking to see if all hosts have failed 11124 1726882381.38485: getting the remaining hosts for this loop 11124 1726882381.38486: done getting the remaining hosts for this loop 11124 1726882381.38490: getting the next task for host managed_node1 11124 1726882381.38497: done getting next task for host managed_node1 11124 1726882381.38499: ^ task is: TASK: Set NM profile exist flag and ansible_managed 
flag true based on the nmcli output 11124 1726882381.38504: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882381.38508: getting variables 11124 1726882381.38510: in VariableManager get_vars() 11124 1726882381.38554: Calling all_inventory to load vars for managed_node1 11124 1726882381.38557: Calling groups_inventory to load vars for managed_node1 11124 1726882381.38559: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.38567: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b6 11124 1726882381.38571: WORKER PROCESS EXITING 11124 1726882381.38681: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.38684: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.38687: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.40531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.41469: done with get_vars() 11124 1726882381.41487: done getting variables 11124 1726882381.41533: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:01 -0400 (0:00:00.491) 0:00:21.658 ****** 11124 1726882381.41556: entering _queue_task() for managed_node1/set_fact 11124 1726882381.41796: worker is 1 (out of 1 available) 11124 1726882381.41808: exiting _queue_task() for managed_node1/set_fact 11124 1726882381.41821: done queuing things up, now waiting for results queue to drain 11124 1726882381.41822: waiting for pending results... 11124 1726882381.42012: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11124 1726882381.42167: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003b7 11124 1726882381.42190: variable 'ansible_search_path' from source: unknown 11124 1726882381.42197: variable 'ansible_search_path' from source: unknown 11124 1726882381.42243: calling self._execute() 11124 1726882381.42338: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.42353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.42369: variable 'omit' from source: magic vars 11124 1726882381.42757: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.42783: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.42926: variable 'nm_profile_exists' from source: set_fact 11124 1726882381.42951: Evaluated conditional (nm_profile_exists.rc == 0): True 11124 1726882381.42965: variable 'omit' from source: magic 
vars 11124 1726882381.43022: variable 'omit' from source: magic vars 11124 1726882381.43061: variable 'omit' from source: magic vars 11124 1726882381.43112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882381.43147: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882381.43181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882381.43203: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.43226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.43261: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882381.43273: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.43281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.43388: Set connection var ansible_shell_executable to /bin/sh 11124 1726882381.43403: Set connection var ansible_shell_type to sh 11124 1726882381.43414: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882381.43423: Set connection var ansible_timeout to 10 11124 1726882381.43440: Set connection var ansible_pipelining to False 11124 1726882381.43447: Set connection var ansible_connection to ssh 11124 1726882381.43476: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.43484: variable 'ansible_connection' from source: unknown 11124 1726882381.43490: variable 'ansible_module_compression' from source: unknown 11124 1726882381.43496: variable 'ansible_shell_type' from source: unknown 11124 1726882381.43501: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.43507: variable 'ansible_host' from source: 
host vars for 'managed_node1' 11124 1726882381.43514: variable 'ansible_pipelining' from source: unknown 11124 1726882381.43519: variable 'ansible_timeout' from source: unknown 11124 1726882381.43526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.43678: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882381.43696: variable 'omit' from source: magic vars 11124 1726882381.43705: starting attempt loop 11124 1726882381.43711: running the handler 11124 1726882381.43727: handler run complete 11124 1726882381.43740: attempt loop complete, returning result 11124 1726882381.43746: _execute() done 11124 1726882381.43753: dumping result to json 11124 1726882381.43766: done dumping result, returning 11124 1726882381.43781: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-8362-0f62-0000000003b7] 11124 1726882381.43790: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b7 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11124 1726882381.43943: no more pending results, returning what we have 11124 1726882381.43946: results queue empty 11124 1726882381.43947: checking for any_errors_fatal 11124 1726882381.43954: done checking for any_errors_fatal 11124 1726882381.43955: checking for max_fail_percentage 11124 1726882381.43957: done checking for max_fail_percentage 11124 1726882381.43958: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.43959: done checking to see if all hosts have failed 11124 
1726882381.43959: getting the remaining hosts for this loop 11124 1726882381.43960: done getting the remaining hosts for this loop 11124 1726882381.43965: getting the next task for host managed_node1 11124 1726882381.43973: done getting next task for host managed_node1 11124 1726882381.43976: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11124 1726882381.43981: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882381.43984: getting variables 11124 1726882381.43986: in VariableManager get_vars() 11124 1726882381.44026: Calling all_inventory to load vars for managed_node1 11124 1726882381.44029: Calling groups_inventory to load vars for managed_node1 11124 1726882381.44031: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.44043: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.44045: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.44048: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.45171: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b7 11124 1726882381.45176: WORKER PROCESS EXITING 11124 1726882381.45846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.47916: done with get_vars() 11124 1726882381.47944: done getting variables 11124 1726882381.48009: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.48136: variable 'profile' from source: include params 11124 1726882381.48140: variable 'item' from source: include params 11124 1726882381.48203: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:01 -0400 (0:00:00.066) 0:00:21.724 ****** 11124 1726882381.48244: entering _queue_task() for managed_node1/command 11124 1726882381.48590: worker is 1 (out of 1 available) 11124 1726882381.48601: exiting _queue_task() for managed_node1/command 11124 
1726882381.48612: done queuing things up, now waiting for results queue to drain 11124 1726882381.48614: waiting for pending results... 11124 1726882381.48905: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 11124 1726882381.49009: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003b9 11124 1726882381.49022: variable 'ansible_search_path' from source: unknown 11124 1726882381.49025: variable 'ansible_search_path' from source: unknown 11124 1726882381.49069: calling self._execute() 11124 1726882381.49156: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.49161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.49180: variable 'omit' from source: magic vars 11124 1726882381.49570: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.49574: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.49771: variable 'profile_stat' from source: set_fact 11124 1726882381.49774: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882381.49776: when evaluation is False, skipping this task 11124 1726882381.49778: _execute() done 11124 1726882381.49780: dumping result to json 11124 1726882381.49781: done dumping result, returning 11124 1726882381.49783: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-8362-0f62-0000000003b9] 11124 1726882381.49785: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b9 11124 1726882381.49846: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003b9 11124 1726882381.49850: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882381.49906: no more pending results, returning what we have 11124 1726882381.49910: results queue empty 11124 
1726882381.49911: checking for any_errors_fatal 11124 1726882381.49918: done checking for any_errors_fatal 11124 1726882381.49919: checking for max_fail_percentage 11124 1726882381.49921: done checking for max_fail_percentage 11124 1726882381.49922: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.49923: done checking to see if all hosts have failed 11124 1726882381.49924: getting the remaining hosts for this loop 11124 1726882381.49925: done getting the remaining hosts for this loop 11124 1726882381.49929: getting the next task for host managed_node1 11124 1726882381.49936: done getting next task for host managed_node1 11124 1726882381.49938: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11124 1726882381.49943: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882381.49949: getting variables 11124 1726882381.49953: in VariableManager get_vars() 11124 1726882381.50001: Calling all_inventory to load vars for managed_node1 11124 1726882381.50004: Calling groups_inventory to load vars for managed_node1 11124 1726882381.50006: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.50021: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.50024: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.50027: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.51580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.54882: done with get_vars() 11124 1726882381.54917: done getting variables 11124 1726882381.54989: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.55110: variable 'profile' from source: include params 11124 1726882381.55115: variable 'item' from source: include params 11124 1726882381.55179: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:01 -0400 (0:00:00.069) 0:00:21.794 ****** 11124 1726882381.55211: entering _queue_task() for managed_node1/set_fact 11124 1726882381.56497: worker is 1 (out of 1 available) 11124 1726882381.56509: exiting _queue_task() for managed_node1/set_fact 11124 1726882381.56521: done queuing things up, now waiting for results queue to drain 11124 1726882381.56522: waiting for pending results... 
11124 1726882381.56814: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 11124 1726882381.56913: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003ba 11124 1726882381.56927: variable 'ansible_search_path' from source: unknown 11124 1726882381.56930: variable 'ansible_search_path' from source: unknown 11124 1726882381.56971: calling self._execute() 11124 1726882381.57059: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.57066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.57082: variable 'omit' from source: magic vars 11124 1726882381.57446: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.57462: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.57589: variable 'profile_stat' from source: set_fact 11124 1726882381.57601: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882381.57604: when evaluation is False, skipping this task 11124 1726882381.57607: _execute() done 11124 1726882381.57610: dumping result to json 11124 1726882381.57612: done dumping result, returning 11124 1726882381.57625: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0e448fcc-3ce9-8362-0f62-0000000003ba] 11124 1726882381.57631: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003ba skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882381.57777: no more pending results, returning what we have 11124 1726882381.57782: results queue empty 11124 1726882381.57784: checking for any_errors_fatal 11124 1726882381.57790: done checking for any_errors_fatal 11124 1726882381.57791: checking for max_fail_percentage 11124 1726882381.57793: done checking for max_fail_percentage 11124 1726882381.57794: checking to see if all hosts 
have failed and the running result is not ok 11124 1726882381.57795: done checking to see if all hosts have failed 11124 1726882381.57796: getting the remaining hosts for this loop 11124 1726882381.57797: done getting the remaining hosts for this loop 11124 1726882381.57801: getting the next task for host managed_node1 11124 1726882381.57807: done getting next task for host managed_node1 11124 1726882381.57810: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11124 1726882381.57815: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882381.57819: getting variables 11124 1726882381.57821: in VariableManager get_vars() 11124 1726882381.57871: Calling all_inventory to load vars for managed_node1 11124 1726882381.57875: Calling groups_inventory to load vars for managed_node1 11124 1726882381.57877: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.57893: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.57897: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.57901: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.58616: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003ba 11124 1726882381.58619: WORKER PROCESS EXITING 11124 1726882381.60600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.63539: done with get_vars() 11124 1726882381.63578: done getting variables 11124 1726882381.63658: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.63783: variable 'profile' from source: include params 11124 1726882381.63787: variable 'item' from source: include params 11124 1726882381.63847: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:01 -0400 (0:00:00.086) 0:00:21.881 ****** 11124 1726882381.63885: entering _queue_task() for managed_node1/command 11124 1726882381.64318: worker is 1 (out of 1 available) 11124 1726882381.64331: exiting _queue_task() for managed_node1/command 11124 
1726882381.64344: done queuing things up, now waiting for results queue to drain 11124 1726882381.64346: waiting for pending results... 11124 1726882381.64833: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 11124 1726882381.64951: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003bb 11124 1726882381.64969: variable 'ansible_search_path' from source: unknown 11124 1726882381.64973: variable 'ansible_search_path' from source: unknown 11124 1726882381.65014: calling self._execute() 11124 1726882381.65129: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.65135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.65292: variable 'omit' from source: magic vars 11124 1726882381.65755: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.65771: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.66009: variable 'profile_stat' from source: set_fact 11124 1726882381.66025: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882381.66028: when evaluation is False, skipping this task 11124 1726882381.66032: _execute() done 11124 1726882381.66041: dumping result to json 11124 1726882381.66044: done dumping result, returning 11124 1726882381.66055: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-8362-0f62-0000000003bb] 11124 1726882381.66058: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003bb 11124 1726882381.66153: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003bb 11124 1726882381.66156: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882381.66216: no more pending results, returning what we have 11124 1726882381.66221: results queue empty 11124 
1726882381.66222: checking for any_errors_fatal 11124 1726882381.66232: done checking for any_errors_fatal 11124 1726882381.66233: checking for max_fail_percentage 11124 1726882381.66235: done checking for max_fail_percentage 11124 1726882381.66236: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.66237: done checking to see if all hosts have failed 11124 1726882381.66238: getting the remaining hosts for this loop 11124 1726882381.66239: done getting the remaining hosts for this loop 11124 1726882381.66244: getting the next task for host managed_node1 11124 1726882381.66253: done getting next task for host managed_node1 11124 1726882381.66256: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11124 1726882381.66262: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882381.66269: getting variables 11124 1726882381.66272: in VariableManager get_vars() 11124 1726882381.66319: Calling all_inventory to load vars for managed_node1 11124 1726882381.66323: Calling groups_inventory to load vars for managed_node1 11124 1726882381.66325: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.66341: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.66344: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.66347: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.68658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.71162: done with get_vars() 11124 1726882381.71231: done getting variables 11124 1726882381.71308: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.71456: variable 'profile' from source: include params 11124 1726882381.71461: variable 'item' from source: include params 11124 1726882381.71525: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:01 -0400 (0:00:00.076) 0:00:21.958 ****** 11124 1726882381.71569: entering _queue_task() for managed_node1/set_fact 11124 1726882381.71915: worker is 1 (out of 1 available) 11124 1726882381.71928: exiting _queue_task() for managed_node1/set_fact 11124 1726882381.71940: done queuing things up, now waiting for results queue to drain 11124 1726882381.71942: waiting for pending results... 
11124 1726882381.72241: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 11124 1726882381.72375: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003bc 11124 1726882381.72427: variable 'ansible_search_path' from source: unknown 11124 1726882381.72436: variable 'ansible_search_path' from source: unknown 11124 1726882381.72481: calling self._execute() 11124 1726882381.72584: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.72595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.72613: variable 'omit' from source: magic vars 11124 1726882381.72967: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.72980: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.73076: variable 'profile_stat' from source: set_fact 11124 1726882381.73091: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882381.73094: when evaluation is False, skipping this task 11124 1726882381.73096: _execute() done 11124 1726882381.73099: dumping result to json 11124 1726882381.73101: done dumping result, returning 11124 1726882381.73107: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [0e448fcc-3ce9-8362-0f62-0000000003bc] 11124 1726882381.73112: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003bc 11124 1726882381.73207: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003bc 11124 1726882381.73210: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882381.73276: no more pending results, returning what we have 11124 1726882381.73280: results queue empty 11124 1726882381.73282: checking for any_errors_fatal 11124 1726882381.73287: done checking for any_errors_fatal 11124 1726882381.73287: checking for 
max_fail_percentage 11124 1726882381.73289: done checking for max_fail_percentage 11124 1726882381.73290: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.73291: done checking to see if all hosts have failed 11124 1726882381.73292: getting the remaining hosts for this loop 11124 1726882381.73294: done getting the remaining hosts for this loop 11124 1726882381.73297: getting the next task for host managed_node1 11124 1726882381.73305: done getting next task for host managed_node1 11124 1726882381.73308: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11124 1726882381.73312: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882381.73315: getting variables 11124 1726882381.73317: in VariableManager get_vars() 11124 1726882381.73367: Calling all_inventory to load vars for managed_node1 11124 1726882381.73371: Calling groups_inventory to load vars for managed_node1 11124 1726882381.73373: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.73388: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.73392: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.73396: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.74996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.76919: done with get_vars() 11124 1726882381.76955: done getting variables 11124 1726882381.77021: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.77173: variable 'profile' from source: include params 11124 1726882381.77177: variable 'item' from source: include params 11124 1726882381.77228: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:01 -0400 (0:00:00.056) 0:00:22.015 ****** 11124 1726882381.77253: entering _queue_task() for managed_node1/assert 11124 1726882381.77568: worker is 1 (out of 1 available) 11124 1726882381.77581: exiting _queue_task() for managed_node1/assert 11124 1726882381.77593: done queuing things up, now waiting for results queue to drain 11124 1726882381.77595: waiting for pending results... 
11124 1726882381.77784: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' 11124 1726882381.78195: in run() - task 0e448fcc-3ce9-8362-0f62-000000000261 11124 1726882381.78199: variable 'ansible_search_path' from source: unknown 11124 1726882381.78202: variable 'ansible_search_path' from source: unknown 11124 1726882381.78205: calling self._execute() 11124 1726882381.78208: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.78210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.78213: variable 'omit' from source: magic vars 11124 1726882381.78357: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.78370: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.78377: variable 'omit' from source: magic vars 11124 1726882381.78417: variable 'omit' from source: magic vars 11124 1726882381.78515: variable 'profile' from source: include params 11124 1726882381.78518: variable 'item' from source: include params 11124 1726882381.78578: variable 'item' from source: include params 11124 1726882381.78596: variable 'omit' from source: magic vars 11124 1726882381.78636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882381.78673: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882381.78693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882381.78710: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.78719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.78748: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11124 1726882381.78754: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.78757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.78848: Set connection var ansible_shell_executable to /bin/sh 11124 1726882381.78853: Set connection var ansible_shell_type to sh 11124 1726882381.78862: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882381.78869: Set connection var ansible_timeout to 10 11124 1726882381.78875: Set connection var ansible_pipelining to False 11124 1726882381.78879: Set connection var ansible_connection to ssh 11124 1726882381.78898: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.78901: variable 'ansible_connection' from source: unknown 11124 1726882381.78904: variable 'ansible_module_compression' from source: unknown 11124 1726882381.78906: variable 'ansible_shell_type' from source: unknown 11124 1726882381.78908: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.78910: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.78915: variable 'ansible_pipelining' from source: unknown 11124 1726882381.78917: variable 'ansible_timeout' from source: unknown 11124 1726882381.78921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.79202: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882381.79207: variable 'omit' from source: magic vars 11124 1726882381.79210: starting attempt loop 11124 1726882381.79212: running the handler 11124 1726882381.79363: variable 'lsr_net_profile_exists' from source: set_fact 11124 1726882381.79576: Evaluated conditional 
(lsr_net_profile_exists): True 11124 1726882381.79587: handler run complete 11124 1726882381.79616: attempt loop complete, returning result 11124 1726882381.79622: _execute() done 11124 1726882381.79638: dumping result to json 11124 1726882381.79654: done dumping result, returning 11124 1726882381.79677: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' [0e448fcc-3ce9-8362-0f62-000000000261] 11124 1726882381.79686: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000261 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882381.79844: no more pending results, returning what we have 11124 1726882381.79847: results queue empty 11124 1726882381.79848: checking for any_errors_fatal 11124 1726882381.79856: done checking for any_errors_fatal 11124 1726882381.79857: checking for max_fail_percentage 11124 1726882381.79860: done checking for max_fail_percentage 11124 1726882381.79861: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.79862: done checking to see if all hosts have failed 11124 1726882381.79864: getting the remaining hosts for this loop 11124 1726882381.79867: done getting the remaining hosts for this loop 11124 1726882381.79870: getting the next task for host managed_node1 11124 1726882381.79877: done getting next task for host managed_node1 11124 1726882381.79879: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11124 1726882381.79883: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882381.79890: getting variables 11124 1726882381.79892: in VariableManager get_vars() 11124 1726882381.79939: Calling all_inventory to load vars for managed_node1 11124 1726882381.79943: Calling groups_inventory to load vars for managed_node1 11124 1726882381.79945: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.79960: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.79967: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.79970: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.80758: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000261 11124 1726882381.80761: WORKER PROCESS EXITING 11124 1726882381.81583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.83320: done with get_vars() 11124 1726882381.83340: done getting variables 11124 1726882381.83390: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.83479: variable 'profile' from source: include params 11124 1726882381.83483: variable 'item' from source: include params 11124 1726882381.83523: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:01 -0400 (0:00:00.062) 0:00:22.077 ****** 11124 1726882381.83548: entering _queue_task() for managed_node1/assert 11124 
1726882381.83787: worker is 1 (out of 1 available) 11124 1726882381.83799: exiting _queue_task() for managed_node1/assert 11124 1726882381.83812: done queuing things up, now waiting for results queue to drain 11124 1726882381.83814: waiting for pending results... 11124 1726882381.83989: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' 11124 1726882381.84059: in run() - task 0e448fcc-3ce9-8362-0f62-000000000262 11124 1726882381.84072: variable 'ansible_search_path' from source: unknown 11124 1726882381.84075: variable 'ansible_search_path' from source: unknown 11124 1726882381.84104: calling self._execute() 11124 1726882381.84197: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.84206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.84219: variable 'omit' from source: magic vars 11124 1726882381.84612: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.84630: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.84640: variable 'omit' from source: magic vars 11124 1726882381.84692: variable 'omit' from source: magic vars 11124 1726882381.84801: variable 'profile' from source: include params 11124 1726882381.84809: variable 'item' from source: include params 11124 1726882381.84875: variable 'item' from source: include params 11124 1726882381.84907: variable 'omit' from source: magic vars 11124 1726882381.84967: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882381.85023: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882381.85050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882381.85086: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11124 1726882381.85108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.85157: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882381.85170: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.85187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.85358: Set connection var ansible_shell_executable to /bin/sh 11124 1726882381.85382: Set connection var ansible_shell_type to sh 11124 1726882381.85404: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882381.85418: Set connection var ansible_timeout to 10 11124 1726882381.85434: Set connection var ansible_pipelining to False 11124 1726882381.85461: Set connection var ansible_connection to ssh 11124 1726882381.85502: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.85515: variable 'ansible_connection' from source: unknown 11124 1726882381.85523: variable 'ansible_module_compression' from source: unknown 11124 1726882381.85533: variable 'ansible_shell_type' from source: unknown 11124 1726882381.85544: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.85564: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.85581: variable 'ansible_pipelining' from source: unknown 11124 1726882381.85598: variable 'ansible_timeout' from source: unknown 11124 1726882381.85610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.85819: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 
1726882381.85847: variable 'omit' from source: magic vars 11124 1726882381.85866: starting attempt loop 11124 1726882381.85874: running the handler 11124 1726882381.86036: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11124 1726882381.86057: Evaluated conditional (lsr_net_profile_ansible_managed): True 11124 1726882381.86071: handler run complete 11124 1726882381.86091: attempt loop complete, returning result 11124 1726882381.86098: _execute() done 11124 1726882381.86114: dumping result to json 11124 1726882381.86122: done dumping result, returning 11124 1726882381.86134: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' [0e448fcc-3ce9-8362-0f62-000000000262] 11124 1726882381.86143: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000262 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882381.86306: no more pending results, returning what we have 11124 1726882381.86309: results queue empty 11124 1726882381.86310: checking for any_errors_fatal 11124 1726882381.86316: done checking for any_errors_fatal 11124 1726882381.86317: checking for max_fail_percentage 11124 1726882381.86318: done checking for max_fail_percentage 11124 1726882381.86319: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.86320: done checking to see if all hosts have failed 11124 1726882381.86321: getting the remaining hosts for this loop 11124 1726882381.86322: done getting the remaining hosts for this loop 11124 1726882381.86326: getting the next task for host managed_node1 11124 1726882381.86331: done getting next task for host managed_node1 11124 1726882381.86333: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11124 1726882381.86337: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882381.86341: getting variables 11124 1726882381.86343: in VariableManager get_vars() 11124 1726882381.86403: Calling all_inventory to load vars for managed_node1 11124 1726882381.86409: Calling groups_inventory to load vars for managed_node1 11124 1726882381.86412: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.86430: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.86437: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.86441: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.86985: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000262 11124 1726882381.86992: WORKER PROCESS EXITING 11124 1726882381.88118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.89483: done with get_vars() 11124 1726882381.89502: done getting variables 11124 1726882381.89544: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882381.89640: variable 'profile' from source: include params 11124 1726882381.89643: variable 'item' from source: include params 11124 1726882381.89721: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:01 -0400 (0:00:00.062) 0:00:22.140 ****** 11124 1726882381.89765: entering _queue_task() for managed_node1/assert 11124 1726882381.90007: worker is 1 (out of 1 available) 11124 1726882381.90021: exiting _queue_task() for managed_node1/assert 11124 1726882381.90034: done queuing things up, now waiting for results queue to drain 11124 1726882381.90035: waiting for pending results... 11124 1726882381.90206: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 11124 1726882381.90263: in run() - task 0e448fcc-3ce9-8362-0f62-000000000263 11124 1726882381.90277: variable 'ansible_search_path' from source: unknown 11124 1726882381.90280: variable 'ansible_search_path' from source: unknown 11124 1726882381.90308: calling self._execute() 11124 1726882381.90384: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.90388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.90396: variable 'omit' from source: magic vars 11124 1726882381.90659: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.90673: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.90681: variable 'omit' from source: magic vars 11124 1726882381.90707: variable 'omit' from source: magic vars 11124 1726882381.90777: variable 'profile' from source: include params 11124 1726882381.90781: variable 'item' from source: include params 11124 1726882381.90825: variable 'item' from source: include params 11124 1726882381.90839: variable 'omit' from source: magic vars 11124 1726882381.90875: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882381.90900: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882381.90916: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882381.90937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.90940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882381.90972: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882381.90976: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.90979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.91048: Set connection var ansible_shell_executable to /bin/sh 11124 1726882381.91054: Set connection var ansible_shell_type to sh 11124 1726882381.91062: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882381.91069: Set connection var ansible_timeout to 10 11124 1726882381.91073: Set connection var ansible_pipelining to False 11124 1726882381.91077: Set connection var ansible_connection to ssh 11124 1726882381.91095: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.91098: variable 'ansible_connection' from source: unknown 11124 1726882381.91101: variable 'ansible_module_compression' from source: unknown 11124 1726882381.91103: variable 'ansible_shell_type' from source: unknown 11124 1726882381.91105: variable 'ansible_shell_executable' from source: unknown 11124 1726882381.91107: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.91111: variable 'ansible_pipelining' from source: unknown 11124 1726882381.91114: variable 'ansible_timeout' from source: unknown 11124 1726882381.91118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 
1726882381.91218: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882381.91227: variable 'omit' from source: magic vars 11124 1726882381.91233: starting attempt loop 11124 1726882381.91236: running the handler 11124 1726882381.91314: variable 'lsr_net_profile_fingerprint' from source: set_fact 11124 1726882381.91317: Evaluated conditional (lsr_net_profile_fingerprint): True 11124 1726882381.91319: handler run complete 11124 1726882381.91331: attempt loop complete, returning result 11124 1726882381.91334: _execute() done 11124 1726882381.91336: dumping result to json 11124 1726882381.91339: done dumping result, returning 11124 1726882381.91347: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 [0e448fcc-3ce9-8362-0f62-000000000263] 11124 1726882381.91352: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000263 11124 1726882381.91435: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000263 11124 1726882381.91438: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882381.91507: no more pending results, returning what we have 11124 1726882381.91510: results queue empty 11124 1726882381.91511: checking for any_errors_fatal 11124 1726882381.91519: done checking for any_errors_fatal 11124 1726882381.91520: checking for max_fail_percentage 11124 1726882381.91521: done checking for max_fail_percentage 11124 1726882381.91528: checking to see if all hosts have failed and the running result is not ok 11124 1726882381.91529: done checking to see if all hosts have failed 11124 1726882381.91530: getting the remaining hosts for this loop 11124 1726882381.91531: done getting the 
remaining hosts for this loop 11124 1726882381.91535: getting the next task for host managed_node1 11124 1726882381.91544: done getting next task for host managed_node1 11124 1726882381.91547: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11124 1726882381.91552: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882381.91556: getting variables 11124 1726882381.91557: in VariableManager get_vars() 11124 1726882381.91593: Calling all_inventory to load vars for managed_node1 11124 1726882381.91596: Calling groups_inventory to load vars for managed_node1 11124 1726882381.91598: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882381.91607: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.91610: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.91612: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.92961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882381.94315: done with get_vars() 11124 1726882381.94333: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:01 -0400 (0:00:00.046) 0:00:22.186 ****** 11124 1726882381.94417: entering 
_queue_task() for managed_node1/include_tasks 11124 1726882381.94670: worker is 1 (out of 1 available) 11124 1726882381.94684: exiting _queue_task() for managed_node1/include_tasks 11124 1726882381.94697: done queuing things up, now waiting for results queue to drain 11124 1726882381.94698: waiting for pending results... 11124 1726882381.94870: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11124 1726882381.94951: in run() - task 0e448fcc-3ce9-8362-0f62-000000000267 11124 1726882381.94965: variable 'ansible_search_path' from source: unknown 11124 1726882381.94969: variable 'ansible_search_path' from source: unknown 11124 1726882381.94998: calling self._execute() 11124 1726882381.95074: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882381.95078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882381.95086: variable 'omit' from source: magic vars 11124 1726882381.95421: variable 'ansible_distribution_major_version' from source: facts 11124 1726882381.95460: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882381.95469: _execute() done 11124 1726882381.95473: dumping result to json 11124 1726882381.95476: done dumping result, returning 11124 1726882381.95481: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-8362-0f62-000000000267] 11124 1726882381.95487: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000267 11124 1726882381.95667: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000267 11124 1726882381.95669: WORKER PROCESS EXITING 11124 1726882381.95695: no more pending results, returning what we have 11124 1726882381.95699: in VariableManager get_vars() 11124 1726882381.95769: Calling all_inventory to load vars for managed_node1 11124 1726882381.95772: Calling groups_inventory to load vars for managed_node1 11124 1726882381.95774: Calling 
all_plugins_inventory to load vars for managed_node1 11124 1726882381.95787: Calling all_plugins_play to load vars for managed_node1 11124 1726882381.95790: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882381.95793: Calling groups_plugins_play to load vars for managed_node1 11124 1726882381.96707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882382.02529: done with get_vars() 11124 1726882382.02557: variable 'ansible_search_path' from source: unknown 11124 1726882382.02559: variable 'ansible_search_path' from source: unknown 11124 1726882382.02612: we have included files to process 11124 1726882382.02613: generating all_blocks data 11124 1726882382.02614: done generating all_blocks data 11124 1726882382.02618: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882382.02619: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882382.02621: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882382.03451: done processing included file 11124 1726882382.03453: iterating over new_blocks loaded from include file 11124 1726882382.03454: in VariableManager get_vars() 11124 1726882382.03482: done with get_vars() 11124 1726882382.03484: filtering new block on tags 11124 1726882382.03502: done filtering new block on tags 11124 1726882382.03504: in VariableManager get_vars() 11124 1726882382.03516: done with get_vars() 11124 1726882382.03518: filtering new block on tags 11124 1726882382.03545: done filtering new block on tags 11124 1726882382.03547: done iterating over new_blocks loaded from include file included: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11124 1726882382.03553: extending task lists for all hosts with included blocks 11124 1726882382.03695: done extending task lists 11124 1726882382.03696: done processing included files 11124 1726882382.03697: results queue empty 11124 1726882382.03697: checking for any_errors_fatal 11124 1726882382.03699: done checking for any_errors_fatal 11124 1726882382.03699: checking for max_fail_percentage 11124 1726882382.03700: done checking for max_fail_percentage 11124 1726882382.03701: checking to see if all hosts have failed and the running result is not ok 11124 1726882382.03701: done checking to see if all hosts have failed 11124 1726882382.03702: getting the remaining hosts for this loop 11124 1726882382.03702: done getting the remaining hosts for this loop 11124 1726882382.03704: getting the next task for host managed_node1 11124 1726882382.03706: done getting next task for host managed_node1 11124 1726882382.03707: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11124 1726882382.03709: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11124 1726882382.03711: getting variables 11124 1726882382.03711: in VariableManager get_vars() 11124 1726882382.03730: Calling all_inventory to load vars for managed_node1 11124 1726882382.03732: Calling groups_inventory to load vars for managed_node1 11124 1726882382.03735: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882382.03739: Calling all_plugins_play to load vars for managed_node1 11124 1726882382.03741: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882382.03742: Calling groups_plugins_play to load vars for managed_node1 11124 1726882382.04669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882382.06273: done with get_vars() 11124 1726882382.06298: done getting variables 11124 1726882382.06344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:02 -0400 (0:00:00.119) 0:00:22.306 ****** 11124 1726882382.06378: entering _queue_task() for managed_node1/set_fact 11124 1726882382.06739: worker is 1 (out of 1 available) 11124 1726882382.06752: exiting _queue_task() for managed_node1/set_fact 11124 1726882382.06767: done queuing things up, now waiting for results queue to drain 11124 1726882382.06769: waiting for pending results... 
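The "Initialize NM profile exist and ansible_managed comment flag" task being queued here (get_profile_stat.yml:3) is a `set_fact` action. Judging from the result the log prints once the handler completes (three `lsr_net_profile_*` facts, all `false`), it is roughly equivalent to the following sketch; this is inferred from the logged output, not the verbatim contents of get_profile_stat.yml:

```yaml
# Hedged sketch of the set_fact task at get_profile_stat.yml:3,
# reconstructed from the "ansible_facts" shown in the task result.
# The actual task file may use different ordering or YAML style.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Initializing the flags to `false` up front means the later assert tasks fail cleanly if the stat/inspection steps never flip them to `true`.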
11124 1726882382.07094: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11124 1726882382.07199: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003fb 11124 1726882382.07216: variable 'ansible_search_path' from source: unknown 11124 1726882382.07219: variable 'ansible_search_path' from source: unknown 11124 1726882382.07253: calling self._execute() 11124 1726882382.07362: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.07369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.07376: variable 'omit' from source: magic vars 11124 1726882382.07834: variable 'ansible_distribution_major_version' from source: facts 11124 1726882382.07850: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882382.07883: variable 'omit' from source: magic vars 11124 1726882382.07928: variable 'omit' from source: magic vars 11124 1726882382.07984: variable 'omit' from source: magic vars 11124 1726882382.08011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882382.08047: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882382.08072: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882382.08089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882382.08104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882382.08153: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882382.08156: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.08159: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11124 1726882382.08252: Set connection var ansible_shell_executable to /bin/sh 11124 1726882382.08263: Set connection var ansible_shell_type to sh 11124 1726882382.08272: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882382.08278: Set connection var ansible_timeout to 10 11124 1726882382.08283: Set connection var ansible_pipelining to False 11124 1726882382.08287: Set connection var ansible_connection to ssh 11124 1726882382.08307: variable 'ansible_shell_executable' from source: unknown 11124 1726882382.08310: variable 'ansible_connection' from source: unknown 11124 1726882382.08316: variable 'ansible_module_compression' from source: unknown 11124 1726882382.08319: variable 'ansible_shell_type' from source: unknown 11124 1726882382.08326: variable 'ansible_shell_executable' from source: unknown 11124 1726882382.08328: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.08333: variable 'ansible_pipelining' from source: unknown 11124 1726882382.08336: variable 'ansible_timeout' from source: unknown 11124 1726882382.08341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.08521: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882382.08536: variable 'omit' from source: magic vars 11124 1726882382.08540: starting attempt loop 11124 1726882382.08545: running the handler 11124 1726882382.08591: handler run complete 11124 1726882382.08594: attempt loop complete, returning result 11124 1726882382.08627: _execute() done 11124 1726882382.08652: dumping result to json 11124 1726882382.08659: done dumping result, returning 11124 1726882382.08687: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-8362-0f62-0000000003fb] 11124 1726882382.08726: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fb 11124 1726882382.08823: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fb 11124 1726882382.08827: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11124 1726882382.08903: no more pending results, returning what we have 11124 1726882382.08910: results queue empty 11124 1726882382.08912: checking for any_errors_fatal 11124 1726882382.08917: done checking for any_errors_fatal 11124 1726882382.08918: checking for max_fail_percentage 11124 1726882382.08920: done checking for max_fail_percentage 11124 1726882382.08921: checking to see if all hosts have failed and the running result is not ok 11124 1726882382.08922: done checking to see if all hosts have failed 11124 1726882382.08923: getting the remaining hosts for this loop 11124 1726882382.08924: done getting the remaining hosts for this loop 11124 1726882382.08928: getting the next task for host managed_node1 11124 1726882382.08935: done getting next task for host managed_node1 11124 1726882382.08937: ^ task is: TASK: Stat profile file 11124 1726882382.08942: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882382.08950: getting variables 11124 1726882382.08952: in VariableManager get_vars() 11124 1726882382.09014: Calling all_inventory to load vars for managed_node1 11124 1726882382.09017: Calling groups_inventory to load vars for managed_node1 11124 1726882382.09020: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882382.09036: Calling all_plugins_play to load vars for managed_node1 11124 1726882382.09040: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882382.09044: Calling groups_plugins_play to load vars for managed_node1 11124 1726882382.10830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882382.12557: done with get_vars() 11124 1726882382.12588: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:02 -0400 (0:00:00.063) 0:00:22.369 ****** 11124 1726882382.12697: entering _queue_task() for managed_node1/stat 11124 1726882382.13035: worker is 1 (out of 1 available) 11124 1726882382.13046: exiting _queue_task() for managed_node1/stat 11124 1726882382.13065: done queuing things up, now waiting for results queue to drain 11124 1726882382.13068: waiting for pending results... 
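The "Stat profile file" task whose remote execution follows (get_profile_stat.yml:9) loads the `normal` action plugin and then bootstraps a remote temp directory over SSH, which is the standard execution path for a module such as `ansible.builtin.stat`. A minimal sketch of what such a task looks like is below; the exact path expression and register name are assumptions for illustration, not taken from the real task file:

```yaml
# Hedged sketch of a profile stat task like get_profile_stat.yml:9.
# The ifcfg path and the "profile_stat" register name are hypothetical;
# the real file may stat a keyfile path or use a different variable.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat
```

The `echo ~ && sleep 0` and `mkdir -p .../.ansible/tmp/...` commands visible in the stderr/stdout chunks that follow are how Ansible discovers the remote home directory and creates the per-task temp directory before copying the `stat` module payload over.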
11124 1726882382.13355: running TaskExecutor() for managed_node1/TASK: Stat profile file 11124 1726882382.13457: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003fc 11124 1726882382.13474: variable 'ansible_search_path' from source: unknown 11124 1726882382.13478: variable 'ansible_search_path' from source: unknown 11124 1726882382.13519: calling self._execute() 11124 1726882382.13652: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.13656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.13658: variable 'omit' from source: magic vars 11124 1726882382.14188: variable 'ansible_distribution_major_version' from source: facts 11124 1726882382.14255: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882382.14324: variable 'omit' from source: magic vars 11124 1726882382.14328: variable 'omit' from source: magic vars 11124 1726882382.14331: variable 'profile' from source: include params 11124 1726882382.14335: variable 'item' from source: include params 11124 1726882382.14472: variable 'item' from source: include params 11124 1726882382.14475: variable 'omit' from source: magic vars 11124 1726882382.14478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882382.14480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882382.14623: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882382.14627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882382.14629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882382.14632: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882382.14635: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.14637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.14720: Set connection var ansible_shell_executable to /bin/sh 11124 1726882382.14723: Set connection var ansible_shell_type to sh 11124 1726882382.14726: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882382.14738: Set connection var ansible_timeout to 10 11124 1726882382.14756: Set connection var ansible_pipelining to False 11124 1726882382.14767: Set connection var ansible_connection to ssh 11124 1726882382.14819: variable 'ansible_shell_executable' from source: unknown 11124 1726882382.14824: variable 'ansible_connection' from source: unknown 11124 1726882382.14837: variable 'ansible_module_compression' from source: unknown 11124 1726882382.14840: variable 'ansible_shell_type' from source: unknown 11124 1726882382.14842: variable 'ansible_shell_executable' from source: unknown 11124 1726882382.14883: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.14886: variable 'ansible_pipelining' from source: unknown 11124 1726882382.14888: variable 'ansible_timeout' from source: unknown 11124 1726882382.14927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.15224: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882382.15240: variable 'omit' from source: magic vars 11124 1726882382.15243: starting attempt loop 11124 1726882382.15245: running the handler 11124 1726882382.15248: _low_level_execute_command(): starting 11124 1726882382.15262: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882382.16090: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.16102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.16112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.16126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.16172: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.16181: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.16192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.16204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.16212: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.16218: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.16226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.16235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.16246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.16257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.16269: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.16276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.16368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.16372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.16378: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11124 1726882382.16691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.18212: stdout chunk (state=3): >>>/root <<< 11124 1726882382.18387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.18392: stdout chunk (state=3): >>><<< 11124 1726882382.18402: stderr chunk (state=3): >>><<< 11124 1726882382.18430: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882382.18444: _low_level_execute_command(): starting 11124 1726882382.18449: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557 `" && echo ansible-tmp-1726882382.1842992-12183-110554921773557="` 
echo /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557 `" ) && sleep 0' 11124 1726882382.19082: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.19090: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.19101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.19114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.19151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.19160: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.19172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.19184: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.19195: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.19198: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.19206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.19215: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.19226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.19231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.19239: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.19248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.19339: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 
1726882382.19348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.19351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.19669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.21469: stdout chunk (state=3): >>>ansible-tmp-1726882382.1842992-12183-110554921773557=/root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557 <<< 11124 1726882382.21582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.21681: stderr chunk (state=3): >>><<< 11124 1726882382.21685: stdout chunk (state=3): >>><<< 11124 1726882382.21773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882382.1842992-12183-110554921773557=/root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 11124 1726882382.21777: variable 'ansible_module_compression' from source: unknown 11124 1726882382.21880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11124 1726882382.21883: variable 'ansible_facts' from source: unknown 11124 1726882382.21954: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/AnsiballZ_stat.py 11124 1726882382.22500: Sending initial data 11124 1726882382.22503: Sent initial data (153 bytes) 11124 1726882382.23596: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.23613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.23641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.23671: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.23733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.23746: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.23768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.23789: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.23802: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.23814: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.23827: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.23845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.23868: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.23885: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.23897: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.23911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.23996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.24013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.24027: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.24159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.25936: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882382.26027: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882382.26132: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmperqllkv7 /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/AnsiballZ_stat.py <<< 11124 1726882382.26214: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882382.27595: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 11124 1726882382.27683: stderr chunk (state=3): >>><<< 11124 1726882382.27687: stdout chunk (state=3): >>><<< 11124 1726882382.27708: done transferring module to remote 11124 1726882382.27720: _low_level_execute_command(): starting 11124 1726882382.27727: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/ /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/AnsiballZ_stat.py && sleep 0' 11124 1726882382.28422: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.28432: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.28441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.28456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.28502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.28509: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.28519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.28532: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.28539: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.28545: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.28554: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.28561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.28580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.28590: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.28597: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.28606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.28676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.28701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.28714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.28831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.30616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.30695: stderr chunk (state=3): >>><<< 11124 1726882382.30699: stdout chunk (state=3): >>><<< 11124 1726882382.30716: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882382.30719: _low_level_execute_command(): starting 11124 1726882382.30725: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/AnsiballZ_stat.py && sleep 0' 11124 1726882382.31339: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.31347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.31358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.31375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.31412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.31420: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.31429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.31443: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.31455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.31458: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.31477: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.31480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.31487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.31495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882382.31502: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.31511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.31597: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.31606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.31610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.31740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.44917: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11124 1726882382.45971: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882382.46002: stdout chunk (state=3): >>><<< 11124 1726882382.46005: stderr chunk (state=3): >>><<< 11124 1726882382.46072: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882382.46079: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882382.46082: _low_level_execute_command(): starting 11124 1726882382.46162: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882382.1842992-12183-110554921773557/ > /dev/null 2>&1 && sleep 0' 11124 1726882382.46725: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.46748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.46785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.46790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.46800: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11124 1726882382.46805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.46812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.46817: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.46823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.46882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.46885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.46895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.47000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.48882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.48925: stderr chunk (state=3): >>><<< 11124 1726882382.48928: stdout chunk (state=3): >>><<< 11124 1726882382.48976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882382.48980: handler run complete 11124 1726882382.49974: attempt loop complete, returning result 11124 1726882382.49978: _execute() done 11124 1726882382.49981: dumping result to json 11124 1726882382.49983: done dumping result, returning 11124 1726882382.49985: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-8362-0f62-0000000003fc] 11124 1726882382.49987: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fc 11124 1726882382.50060: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fc ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11124 1726882382.50128: no more pending results, returning what we have 11124 1726882382.50131: results queue empty 11124 1726882382.50132: checking for any_errors_fatal 11124 1726882382.50137: done checking for any_errors_fatal 11124 1726882382.50138: checking for max_fail_percentage 11124 1726882382.50140: done checking for max_fail_percentage 11124 1726882382.50141: checking to see if all hosts have failed and the running result is not ok 11124 1726882382.50142: done checking to see if all hosts have failed 11124 1726882382.50142: getting the remaining hosts for this loop 11124 1726882382.50144: done getting the remaining hosts for this loop 11124 1726882382.50147: getting the next task for host managed_node1 11124 1726882382.50157: done getting next task for host managed_node1 11124 1726882382.50160: ^ task is: TASK: Set NM profile exist flag based on the profile files 11124 1726882382.50180: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882382.50185: getting variables 11124 1726882382.50186: in VariableManager get_vars() 11124 1726882382.50224: Calling all_inventory to load vars for managed_node1 11124 1726882382.50226: Calling groups_inventory to load vars for managed_node1 11124 1726882382.50229: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882382.50237: WORKER PROCESS EXITING 11124 1726882382.50248: Calling all_plugins_play to load vars for managed_node1 11124 1726882382.50253: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882382.50256: Calling groups_plugins_play to load vars for managed_node1 11124 1726882382.52171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882382.54309: done with get_vars() 11124 1726882382.54333: done getting variables 11124 1726882382.54411: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the 
profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:02 -0400 (0:00:00.417) 0:00:22.787 ****** 11124 1726882382.54487: entering _queue_task() for managed_node1/set_fact 11124 1726882382.55001: worker is 1 (out of 1 available) 11124 1726882382.55014: exiting _queue_task() for managed_node1/set_fact 11124 1726882382.55026: done queuing things up, now waiting for results queue to drain 11124 1726882382.55028: waiting for pending results... 11124 1726882382.55358: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11124 1726882382.55476: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003fd 11124 1726882382.55496: variable 'ansible_search_path' from source: unknown 11124 1726882382.55500: variable 'ansible_search_path' from source: unknown 11124 1726882382.55535: calling self._execute() 11124 1726882382.55648: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.55662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.55674: variable 'omit' from source: magic vars 11124 1726882382.56168: variable 'ansible_distribution_major_version' from source: facts 11124 1726882382.56180: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882382.56310: variable 'profile_stat' from source: set_fact 11124 1726882382.56353: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882382.56356: when evaluation is False, skipping this task 11124 1726882382.56359: _execute() done 11124 1726882382.56368: dumping result to json 11124 1726882382.56371: done dumping result, returning 11124 1726882382.56377: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-8362-0f62-0000000003fd] 11124 1726882382.56384: sending task 
result for task 0e448fcc-3ce9-8362-0f62-0000000003fd skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882382.56551: no more pending results, returning what we have 11124 1726882382.56556: results queue empty 11124 1726882382.56557: checking for any_errors_fatal 11124 1726882382.56569: done checking for any_errors_fatal 11124 1726882382.56570: checking for max_fail_percentage 11124 1726882382.56571: done checking for max_fail_percentage 11124 1726882382.56572: checking to see if all hosts have failed and the running result is not ok 11124 1726882382.56574: done checking to see if all hosts have failed 11124 1726882382.56574: getting the remaining hosts for this loop 11124 1726882382.56575: done getting the remaining hosts for this loop 11124 1726882382.56579: getting the next task for host managed_node1 11124 1726882382.56585: done getting next task for host managed_node1 11124 1726882382.56588: ^ task is: TASK: Get NM profile info 11124 1726882382.56592: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882382.56598: getting variables 11124 1726882382.56599: in VariableManager get_vars() 11124 1726882382.56648: Calling all_inventory to load vars for managed_node1 11124 1726882382.56653: Calling groups_inventory to load vars for managed_node1 11124 1726882382.56657: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882382.56665: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fd 11124 1726882382.56669: WORKER PROCESS EXITING 11124 1726882382.56693: Calling all_plugins_play to load vars for managed_node1 11124 1726882382.56696: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882382.56699: Calling groups_plugins_play to load vars for managed_node1 11124 1726882382.59166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882382.63546: done with get_vars() 11124 1726882382.63591: done getting variables 11124 1726882382.63669: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:02 -0400 (0:00:00.092) 0:00:22.879 ****** 11124 1726882382.63706: entering _queue_task() for managed_node1/shell 11124 1726882382.64119: worker is 1 (out of 1 available) 11124 1726882382.64130: exiting _queue_task() for managed_node1/shell 11124 1726882382.64146: done queuing things up, now waiting for results queue to drain 11124 1726882382.64148: waiting for pending results... 
11124 1726882382.64445: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11124 1726882382.64548: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003fe 11124 1726882382.64568: variable 'ansible_search_path' from source: unknown 11124 1726882382.64571: variable 'ansible_search_path' from source: unknown 11124 1726882382.64614: calling self._execute() 11124 1726882382.64722: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.64726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.64735: variable 'omit' from source: magic vars 11124 1726882382.65685: variable 'ansible_distribution_major_version' from source: facts 11124 1726882382.65733: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882382.65740: variable 'omit' from source: magic vars 11124 1726882382.65799: variable 'omit' from source: magic vars 11124 1726882382.66283: variable 'profile' from source: include params 11124 1726882382.66290: variable 'item' from source: include params 11124 1726882382.66536: variable 'item' from source: include params 11124 1726882382.66559: variable 'omit' from source: magic vars 11124 1726882382.66631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882382.66669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882382.66706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882382.66743: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882382.66757: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882382.66801: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882382.66831: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.66965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.67104: Set connection var ansible_shell_executable to /bin/sh 11124 1726882382.67112: Set connection var ansible_shell_type to sh 11124 1726882382.67120: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882382.67125: Set connection var ansible_timeout to 10 11124 1726882382.67130: Set connection var ansible_pipelining to False 11124 1726882382.67133: Set connection var ansible_connection to ssh 11124 1726882382.67306: variable 'ansible_shell_executable' from source: unknown 11124 1726882382.67309: variable 'ansible_connection' from source: unknown 11124 1726882382.67312: variable 'ansible_module_compression' from source: unknown 11124 1726882382.67314: variable 'ansible_shell_type' from source: unknown 11124 1726882382.67316: variable 'ansible_shell_executable' from source: unknown 11124 1726882382.67320: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882382.67324: variable 'ansible_pipelining' from source: unknown 11124 1726882382.67327: variable 'ansible_timeout' from source: unknown 11124 1726882382.67331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882382.67716: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882382.67732: variable 'omit' from source: magic vars 11124 1726882382.67742: starting attempt loop 11124 1726882382.67745: running the handler 11124 1726882382.67758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882382.67780: _low_level_execute_command(): starting 11124 1726882382.67788: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882382.68383: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.68397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.68412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.68426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.68484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.68499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.68636: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.70533: stdout chunk (state=3): >>>/root <<< 11124 1726882382.70830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.70834: 
stderr chunk (state=3): >>><<< 11124 1726882382.70836: stdout chunk (state=3): >>><<< 11124 1726882382.70841: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882382.70843: _low_level_execute_command(): starting 11124 1726882382.70847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809 `" && echo ansible-tmp-1726882382.7064357-12208-229610237889809="` echo /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809 `" ) && sleep 0' 11124 1726882382.72297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.72309: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.72351: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.72361: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.72367: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.72382: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.72395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.72401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.72433: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.72520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.72655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.72664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.72792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.74672: stdout chunk (state=3): >>>ansible-tmp-1726882382.7064357-12208-229610237889809=/root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809 <<< 11124 1726882382.74840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.74843: stderr chunk (state=3): >>><<< 11124 1726882382.74848: stdout chunk (state=3): >>><<< 11124 1726882382.74874: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882382.7064357-12208-229610237889809=/root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882382.74907: variable 'ansible_module_compression' from source: unknown 11124 1726882382.74964: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882382.75006: variable 'ansible_facts' from source: unknown 11124 1726882382.75083: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/AnsiballZ_command.py 11124 1726882382.75387: Sending initial data 11124 1726882382.75390: Sent initial data (156 bytes) 11124 1726882382.76747: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11124 1726882382.76756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.76798: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.76801: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.76818: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882382.76821: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.76834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.76912: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.76916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.76931: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.77042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.78777: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server 
supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882382.78872: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882382.78968: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp41r5se68 /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/AnsiballZ_command.py <<< 11124 1726882382.79069: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882382.80382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.80569: stderr chunk (state=3): >>><<< 11124 1726882382.80572: stdout chunk (state=3): >>><<< 11124 1726882382.80575: done transferring module to remote 11124 1726882382.80577: _low_level_execute_command(): starting 11124 1726882382.80579: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/ /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/AnsiballZ_command.py && sleep 0' 11124 1726882382.81144: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.81159: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.81181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.81201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.81244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.81258: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.81275: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.81298: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.81311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.81323: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.81336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.81350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.81369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.81383: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.81395: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.81409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.81486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.81503: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.81517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.81646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.83401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882382.83489: stderr chunk (state=3): >>><<< 11124 1726882382.83492: stdout chunk (state=3): >>><<< 11124 1726882382.83584: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882382.83587: _low_level_execute_command(): starting 11124 1726882382.83590: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/AnsiballZ_command.py && sleep 0' 11124 1726882382.84143: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882382.84157: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.84176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.84194: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.84236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.84247: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882382.84261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11124 1726882382.84282: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882382.84293: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882382.84303: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882382.84314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882382.84327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882382.84342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882382.84353: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882382.84367: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882382.84382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882382.84457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882382.84479: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882382.84494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882382.84631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882382.99995: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:02.975811", "end": "2024-09-20 21:33:02.998521", "delta": "0:00:00.022710", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, 
"chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882383.01268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882383.01327: stderr chunk (state=3): >>><<< 11124 1726882383.01330: stdout chunk (state=3): >>><<< 11124 1726882383.01370: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:33:02.975811", "end": "2024-09-20 21:33:02.998521", "delta": "0:00:00.022710", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882383.01379: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882383.01385: _low_level_execute_command(): starting 11124 1726882383.01393: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882382.7064357-12208-229610237889809/ > /dev/null 2>&1 && sleep 0' 11124 1726882383.01854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.01858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.01877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882383.01884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.01891: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882383.01899: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882383.01904: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.01912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.01921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.01924: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882383.01932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.01983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882383.02013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.02016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.02109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.03951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.04014: stderr chunk (state=3): >>><<< 11124 1726882383.04016: stdout chunk (state=3): >>><<< 11124 1726882383.04080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882383.04083: handler run complete 11124 1726882383.04084: Evaluated conditional (False): False 11124 1726882383.04086: attempt loop complete, returning result 11124 1726882383.04087: _execute() done 11124 1726882383.04088: dumping result to json 11124 1726882383.04089: done dumping result, returning 11124 1726882383.04090: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-8362-0f62-0000000003fe] 11124 1726882383.04092: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fe 11124 1726882383.04231: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003fe 11124 1726882383.04234: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022710", "end": "2024-09-20 21:33:02.998521", "rc": 0, "start": "2024-09-20 21:33:02.975811" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11124 1726882383.04305: no more pending results, returning what we have 11124 1726882383.04308: results queue empty 11124 1726882383.04309: checking for any_errors_fatal 11124 1726882383.04314: done checking for any_errors_fatal 11124 1726882383.04315: checking for max_fail_percentage 11124 1726882383.04316: done checking for 
max_fail_percentage 11124 1726882383.04317: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.04318: done checking to see if all hosts have failed 11124 1726882383.04319: getting the remaining hosts for this loop 11124 1726882383.04320: done getting the remaining hosts for this loop 11124 1726882383.04323: getting the next task for host managed_node1 11124 1726882383.04328: done getting next task for host managed_node1 11124 1726882383.04330: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11124 1726882383.04334: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882383.04337: getting variables 11124 1726882383.04339: in VariableManager get_vars() 11124 1726882383.04377: Calling all_inventory to load vars for managed_node1 11124 1726882383.04380: Calling groups_inventory to load vars for managed_node1 11124 1726882383.04382: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.04392: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.04394: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.04396: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.05334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.06804: done with get_vars() 11124 1726882383.06828: done getting variables 11124 1726882383.06910: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:03 -0400 (0:00:00.432) 0:00:23.311 ****** 11124 1726882383.06942: entering _queue_task() for managed_node1/set_fact 11124 1726882383.07296: worker is 1 (out of 1 available) 11124 1726882383.07308: exiting _queue_task() for managed_node1/set_fact 11124 1726882383.07328: done queuing things up, now waiting for results queue to drain 11124 1726882383.07330: waiting for pending results... 
11124 1726882383.07633: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11124 1726882383.07747: in run() - task 0e448fcc-3ce9-8362-0f62-0000000003ff 11124 1726882383.07768: variable 'ansible_search_path' from source: unknown 11124 1726882383.07772: variable 'ansible_search_path' from source: unknown 11124 1726882383.07808: calling self._execute() 11124 1726882383.07901: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.07905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.07913: variable 'omit' from source: magic vars 11124 1726882383.08194: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.08205: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.08298: variable 'nm_profile_exists' from source: set_fact 11124 1726882383.08310: Evaluated conditional (nm_profile_exists.rc == 0): True 11124 1726882383.08317: variable 'omit' from source: magic vars 11124 1726882383.08347: variable 'omit' from source: magic vars 11124 1726882383.08372: variable 'omit' from source: magic vars 11124 1726882383.08406: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.08433: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.08452: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882383.08466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.08477: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.08501: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11124 1726882383.08505: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.08507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.08579: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.08586: Set connection var ansible_shell_type to sh 11124 1726882383.08592: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.08598: Set connection var ansible_timeout to 10 11124 1726882383.08602: Set connection var ansible_pipelining to False 11124 1726882383.08605: Set connection var ansible_connection to ssh 11124 1726882383.08621: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.08624: variable 'ansible_connection' from source: unknown 11124 1726882383.08629: variable 'ansible_module_compression' from source: unknown 11124 1726882383.08632: variable 'ansible_shell_type' from source: unknown 11124 1726882383.08634: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.08636: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.08639: variable 'ansible_pipelining' from source: unknown 11124 1726882383.08641: variable 'ansible_timeout' from source: unknown 11124 1726882383.08644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.08744: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.08756: variable 'omit' from source: magic vars 11124 1726882383.08759: starting attempt loop 11124 1726882383.08762: running the handler 11124 1726882383.08774: handler run complete 11124 1726882383.08782: attempt loop complete, returning result 11124 1726882383.08785: _execute() done 
11124 1726882383.08787: dumping result to json 11124 1726882383.08789: done dumping result, returning 11124 1726882383.08797: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-8362-0f62-0000000003ff] 11124 1726882383.08802: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003ff ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11124 1726882383.08937: no more pending results, returning what we have 11124 1726882383.08940: results queue empty 11124 1726882383.08941: checking for any_errors_fatal 11124 1726882383.08952: done checking for any_errors_fatal 11124 1726882383.08952: checking for max_fail_percentage 11124 1726882383.08954: done checking for max_fail_percentage 11124 1726882383.08955: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.08956: done checking to see if all hosts have failed 11124 1726882383.08957: getting the remaining hosts for this loop 11124 1726882383.08958: done getting the remaining hosts for this loop 11124 1726882383.08961: getting the next task for host managed_node1 11124 1726882383.08971: done getting next task for host managed_node1 11124 1726882383.08973: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11124 1726882383.08978: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.08983: getting variables 11124 1726882383.08986: in VariableManager get_vars() 11124 1726882383.09023: Calling all_inventory to load vars for managed_node1 11124 1726882383.09026: Calling groups_inventory to load vars for managed_node1 11124 1726882383.09028: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.09038: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.09040: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.09043: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.09955: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000003ff 11124 1726882383.09959: WORKER PROCESS EXITING 11124 1726882383.09976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.11749: done with get_vars() 11124 1726882383.11779: done getting variables 11124 1726882383.11840: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.11967: variable 'profile' from source: include params 11124 1726882383.11970: variable 'item' from source: include params 11124 1726882383.12014: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:03 -0400 (0:00:00.050) 0:00:23.362 ****** 11124 1726882383.12041: entering _queue_task() for managed_node1/command 11124 1726882383.12324: worker is 1 (out of 1 available) 11124 1726882383.12337: exiting _queue_task() for managed_node1/command 11124 1726882383.12353: done queuing things up, now waiting for results queue to drain 11124 1726882383.12355: waiting for pending results... 11124 1726882383.12537: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11124 1726882383.12771: in run() - task 0e448fcc-3ce9-8362-0f62-000000000401 11124 1726882383.12775: variable 'ansible_search_path' from source: unknown 11124 1726882383.12778: variable 'ansible_search_path' from source: unknown 11124 1726882383.12781: calling self._execute() 11124 1726882383.12828: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.12839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.12855: variable 'omit' from source: magic vars 11124 1726882383.13240: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.13254: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.13375: variable 'profile_stat' from source: set_fact 11124 1726882383.13387: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882383.13391: when evaluation is False, skipping this task 11124 1726882383.13394: _execute() done 11124 1726882383.13396: dumping result to json 11124 1726882383.13398: done dumping result, returning 11124 1726882383.13406: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-8362-0f62-000000000401] 11124 
1726882383.13411: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000401 11124 1726882383.13504: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000401 11124 1726882383.13506: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882383.13555: no more pending results, returning what we have 11124 1726882383.13559: results queue empty 11124 1726882383.13560: checking for any_errors_fatal 11124 1726882383.13569: done checking for any_errors_fatal 11124 1726882383.13570: checking for max_fail_percentage 11124 1726882383.13572: done checking for max_fail_percentage 11124 1726882383.13573: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.13575: done checking to see if all hosts have failed 11124 1726882383.13576: getting the remaining hosts for this loop 11124 1726882383.13577: done getting the remaining hosts for this loop 11124 1726882383.13580: getting the next task for host managed_node1 11124 1726882383.13586: done getting next task for host managed_node1 11124 1726882383.13588: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11124 1726882383.13592: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.13596: getting variables 11124 1726882383.13597: in VariableManager get_vars() 11124 1726882383.13632: Calling all_inventory to load vars for managed_node1 11124 1726882383.13634: Calling groups_inventory to load vars for managed_node1 11124 1726882383.13636: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.13647: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.13651: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.13654: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.14825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.15756: done with get_vars() 11124 1726882383.15772: done getting variables 11124 1726882383.15813: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.15891: variable 'profile' from source: include params 11124 1726882383.15894: variable 'item' from source: include params 11124 1726882383.15931: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:03 -0400 (0:00:00.039) 0:00:23.402 ****** 11124 1726882383.15954: entering _queue_task() for managed_node1/set_fact 11124 1726882383.16202: worker is 1 (out of 1 available) 11124 1726882383.16213: exiting _queue_task() for managed_node1/set_fact 11124 
1726882383.16225: done queuing things up, now waiting for results queue to drain 11124 1726882383.16227: waiting for pending results... 11124 1726882383.16493: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11124 1726882383.16607: in run() - task 0e448fcc-3ce9-8362-0f62-000000000402 11124 1726882383.16618: variable 'ansible_search_path' from source: unknown 11124 1726882383.16622: variable 'ansible_search_path' from source: unknown 11124 1726882383.16657: calling self._execute() 11124 1726882383.16753: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.16756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.16769: variable 'omit' from source: magic vars 11124 1726882383.17116: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.17130: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.17248: variable 'profile_stat' from source: set_fact 11124 1726882383.17260: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882383.17263: when evaluation is False, skipping this task 11124 1726882383.17269: _execute() done 11124 1726882383.17271: dumping result to json 11124 1726882383.17274: done dumping result, returning 11124 1726882383.17277: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0e448fcc-3ce9-8362-0f62-000000000402] 11124 1726882383.17280: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000402 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882383.17420: no more pending results, returning what we have 11124 1726882383.17424: results queue empty 11124 1726882383.17425: checking for any_errors_fatal 11124 1726882383.17432: done checking for any_errors_fatal 11124 1726882383.17433: 
checking for max_fail_percentage 11124 1726882383.17434: done checking for max_fail_percentage 11124 1726882383.17435: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.17437: done checking to see if all hosts have failed 11124 1726882383.17437: getting the remaining hosts for this loop 11124 1726882383.17438: done getting the remaining hosts for this loop 11124 1726882383.17442: getting the next task for host managed_node1 11124 1726882383.17448: done getting next task for host managed_node1 11124 1726882383.17451: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11124 1726882383.17456: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882383.17462: getting variables 11124 1726882383.17466: in VariableManager get_vars() 11124 1726882383.17509: Calling all_inventory to load vars for managed_node1 11124 1726882383.17512: Calling groups_inventory to load vars for managed_node1 11124 1726882383.17515: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.17532: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.17536: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.17540: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.18060: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000402 11124 1726882383.18064: WORKER PROCESS EXITING 11124 1726882383.18468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.19499: done with get_vars() 11124 1726882383.19513: done getting variables 11124 1726882383.19555: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.19630: variable 'profile' from source: include params 11124 1726882383.19632: variable 'item' from source: include params 11124 1726882383.19675: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:03 -0400 (0:00:00.037) 0:00:23.439 ****** 11124 1726882383.19697: entering _queue_task() for managed_node1/command 11124 1726882383.19903: worker is 1 (out of 1 available) 11124 1726882383.19917: exiting _queue_task() for managed_node1/command 11124 
1726882383.19928: done queuing things up, now waiting for results queue to drain 11124 1726882383.19930: waiting for pending results... 11124 1726882383.20102: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 11124 1726882383.20177: in run() - task 0e448fcc-3ce9-8362-0f62-000000000403 11124 1726882383.20187: variable 'ansible_search_path' from source: unknown 11124 1726882383.20191: variable 'ansible_search_path' from source: unknown 11124 1726882383.20219: calling self._execute() 11124 1726882383.20291: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.20295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.20304: variable 'omit' from source: magic vars 11124 1726882383.20560: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.20570: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.20656: variable 'profile_stat' from source: set_fact 11124 1726882383.20665: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882383.20668: when evaluation is False, skipping this task 11124 1726882383.20672: _execute() done 11124 1726882383.20675: dumping result to json 11124 1726882383.20677: done dumping result, returning 11124 1726882383.20683: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-8362-0f62-000000000403] 11124 1726882383.20688: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000403 11124 1726882383.20776: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000403 11124 1726882383.20779: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882383.20846: no more pending results, returning what we have 11124 1726882383.20852: results queue empty 11124 
1726882383.20853: checking for any_errors_fatal 11124 1726882383.20858: done checking for any_errors_fatal 11124 1726882383.20859: checking for max_fail_percentage 11124 1726882383.20860: done checking for max_fail_percentage 11124 1726882383.20861: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.20862: done checking to see if all hosts have failed 11124 1726882383.20863: getting the remaining hosts for this loop 11124 1726882383.20865: done getting the remaining hosts for this loop 11124 1726882383.20868: getting the next task for host managed_node1 11124 1726882383.20873: done getting next task for host managed_node1 11124 1726882383.20875: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11124 1726882383.20884: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882383.20888: getting variables 11124 1726882383.20889: in VariableManager get_vars() 11124 1726882383.20921: Calling all_inventory to load vars for managed_node1 11124 1726882383.20923: Calling groups_inventory to load vars for managed_node1 11124 1726882383.20926: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.20934: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.20936: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.20938: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.21708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.22654: done with get_vars() 11124 1726882383.22672: done getting variables 11124 1726882383.22714: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.22791: variable 'profile' from source: include params 11124 1726882383.22794: variable 'item' from source: include params 11124 1726882383.22831: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:03 -0400 (0:00:00.031) 0:00:23.471 ****** 11124 1726882383.22856: entering _queue_task() for managed_node1/set_fact 11124 1726882383.23065: worker is 1 (out of 1 available) 11124 1726882383.23077: exiting _queue_task() for managed_node1/set_fact 11124 1726882383.23090: done queuing things up, now waiting for results queue to drain 11124 1726882383.23092: waiting for pending results... 
11124 1726882383.23258: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11124 1726882383.23328: in run() - task 0e448fcc-3ce9-8362-0f62-000000000404 11124 1726882383.23338: variable 'ansible_search_path' from source: unknown 11124 1726882383.23342: variable 'ansible_search_path' from source: unknown 11124 1726882383.23373: calling self._execute() 11124 1726882383.23441: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.23445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.23455: variable 'omit' from source: magic vars 11124 1726882383.23699: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.23709: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.23793: variable 'profile_stat' from source: set_fact 11124 1726882383.23803: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882383.23807: when evaluation is False, skipping this task 11124 1726882383.23811: _execute() done 11124 1726882383.23814: dumping result to json 11124 1726882383.23816: done dumping result, returning 11124 1726882383.23819: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0e448fcc-3ce9-8362-0f62-000000000404] 11124 1726882383.23826: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000404 11124 1726882383.23908: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000404 11124 1726882383.23911: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882383.23966: no more pending results, returning what we have 11124 1726882383.23969: results queue empty 11124 1726882383.23970: checking for any_errors_fatal 11124 1726882383.23976: done checking for any_errors_fatal 11124 1726882383.23977: checking 
for max_fail_percentage 11124 1726882383.23978: done checking for max_fail_percentage 11124 1726882383.23979: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.23980: done checking to see if all hosts have failed 11124 1726882383.23981: getting the remaining hosts for this loop 11124 1726882383.23982: done getting the remaining hosts for this loop 11124 1726882383.23984: getting the next task for host managed_node1 11124 1726882383.23990: done getting next task for host managed_node1 11124 1726882383.23993: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11124 1726882383.23996: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882383.23999: getting variables 11124 1726882383.24000: in VariableManager get_vars() 11124 1726882383.24033: Calling all_inventory to load vars for managed_node1 11124 1726882383.24035: Calling groups_inventory to load vars for managed_node1 11124 1726882383.24044: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.24055: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.24057: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.24059: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.24933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.25858: done with get_vars() 11124 1726882383.25876: done getting variables 11124 1726882383.25917: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.25994: variable 'profile' from source: include params 11124 1726882383.25998: variable 'item' from source: include params 11124 1726882383.26035: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:03 -0400 (0:00:00.031) 0:00:23.503 ****** 11124 1726882383.26058: entering _queue_task() for managed_node1/assert 11124 1726882383.26259: worker is 1 (out of 1 available) 11124 1726882383.26274: exiting _queue_task() for managed_node1/assert 11124 1726882383.26285: done queuing things up, now waiting for results queue to drain 11124 1726882383.26287: waiting for pending results... 
11124 1726882383.26444: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' 11124 1726882383.26509: in run() - task 0e448fcc-3ce9-8362-0f62-000000000268 11124 1726882383.26518: variable 'ansible_search_path' from source: unknown 11124 1726882383.26521: variable 'ansible_search_path' from source: unknown 11124 1726882383.26556: calling self._execute() 11124 1726882383.26624: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.26628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.26640: variable 'omit' from source: magic vars 11124 1726882383.26885: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.26895: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.26901: variable 'omit' from source: magic vars 11124 1726882383.26926: variable 'omit' from source: magic vars 11124 1726882383.26993: variable 'profile' from source: include params 11124 1726882383.26998: variable 'item' from source: include params 11124 1726882383.27042: variable 'item' from source: include params 11124 1726882383.27056: variable 'omit' from source: magic vars 11124 1726882383.27092: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.27117: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.27134: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882383.27147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.27157: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.27183: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11124 1726882383.27186: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.27189: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.27255: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.27261: Set connection var ansible_shell_type to sh 11124 1726882383.27270: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.27275: Set connection var ansible_timeout to 10 11124 1726882383.27280: Set connection var ansible_pipelining to False 11124 1726882383.27284: Set connection var ansible_connection to ssh 11124 1726882383.27299: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.27302: variable 'ansible_connection' from source: unknown 11124 1726882383.27305: variable 'ansible_module_compression' from source: unknown 11124 1726882383.27307: variable 'ansible_shell_type' from source: unknown 11124 1726882383.27309: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.27311: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.27315: variable 'ansible_pipelining' from source: unknown 11124 1726882383.27318: variable 'ansible_timeout' from source: unknown 11124 1726882383.27322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.27421: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.27429: variable 'omit' from source: magic vars 11124 1726882383.27435: starting attempt loop 11124 1726882383.27438: running the handler 11124 1726882383.27512: variable 'lsr_net_profile_exists' from source: set_fact 11124 1726882383.27517: Evaluated conditional 
(lsr_net_profile_exists): True 11124 1726882383.27522: handler run complete 11124 1726882383.27534: attempt loop complete, returning result 11124 1726882383.27537: _execute() done 11124 1726882383.27539: dumping result to json 11124 1726882383.27542: done dumping result, returning 11124 1726882383.27554: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' [0e448fcc-3ce9-8362-0f62-000000000268] 11124 1726882383.27557: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000268 11124 1726882383.27635: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000268 11124 1726882383.27638: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882383.27690: no more pending results, returning what we have 11124 1726882383.27693: results queue empty 11124 1726882383.27694: checking for any_errors_fatal 11124 1726882383.27701: done checking for any_errors_fatal 11124 1726882383.27701: checking for max_fail_percentage 11124 1726882383.27703: done checking for max_fail_percentage 11124 1726882383.27704: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.27705: done checking to see if all hosts have failed 11124 1726882383.27706: getting the remaining hosts for this loop 11124 1726882383.27707: done getting the remaining hosts for this loop 11124 1726882383.27710: getting the next task for host managed_node1 11124 1726882383.27715: done getting next task for host managed_node1 11124 1726882383.27718: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11124 1726882383.27725: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.27729: getting variables 11124 1726882383.27730: in VariableManager get_vars() 11124 1726882383.27770: Calling all_inventory to load vars for managed_node1 11124 1726882383.27772: Calling groups_inventory to load vars for managed_node1 11124 1726882383.27775: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.27784: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.27786: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.27789: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.28580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.29612: done with get_vars() 11124 1726882383.29627: done getting variables 11124 1726882383.29671: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.29747: variable 'profile' from source: include params 11124 1726882383.29752: variable 'item' from source: include params 11124 1726882383.29792: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:03 -0400 
(0:00:00.037) 0:00:23.540 ****** 11124 1726882383.29819: entering _queue_task() for managed_node1/assert 11124 1726882383.30024: worker is 1 (out of 1 available) 11124 1726882383.30037: exiting _queue_task() for managed_node1/assert 11124 1726882383.30052: done queuing things up, now waiting for results queue to drain 11124 1726882383.30054: waiting for pending results... 11124 1726882383.30221: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11124 1726882383.30287: in run() - task 0e448fcc-3ce9-8362-0f62-000000000269 11124 1726882383.30298: variable 'ansible_search_path' from source: unknown 11124 1726882383.30302: variable 'ansible_search_path' from source: unknown 11124 1726882383.30330: calling self._execute() 11124 1726882383.30403: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.30406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.30414: variable 'omit' from source: magic vars 11124 1726882383.30668: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.30678: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.30687: variable 'omit' from source: magic vars 11124 1726882383.30714: variable 'omit' from source: magic vars 11124 1726882383.30782: variable 'profile' from source: include params 11124 1726882383.30785: variable 'item' from source: include params 11124 1726882383.30831: variable 'item' from source: include params 11124 1726882383.30845: variable 'omit' from source: magic vars 11124 1726882383.30880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.30905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.30919: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 
1726882383.30933: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.30942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.30967: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882383.30970: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.30972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.31041: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.31047: Set connection var ansible_shell_type to sh 11124 1726882383.31055: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.31060: Set connection var ansible_timeout to 10 11124 1726882383.31066: Set connection var ansible_pipelining to False 11124 1726882383.31069: Set connection var ansible_connection to ssh 11124 1726882383.31085: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.31087: variable 'ansible_connection' from source: unknown 11124 1726882383.31090: variable 'ansible_module_compression' from source: unknown 11124 1726882383.31093: variable 'ansible_shell_type' from source: unknown 11124 1726882383.31095: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.31098: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.31100: variable 'ansible_pipelining' from source: unknown 11124 1726882383.31102: variable 'ansible_timeout' from source: unknown 11124 1726882383.31107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.31205: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.31216: variable 'omit' from source: magic vars 11124 1726882383.31219: starting attempt loop 11124 1726882383.31222: running the handler 11124 1726882383.31297: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11124 1726882383.31300: Evaluated conditional (lsr_net_profile_ansible_managed): True 11124 1726882383.31307: handler run complete 11124 1726882383.31318: attempt loop complete, returning result 11124 1726882383.31322: _execute() done 11124 1726882383.31324: dumping result to json 11124 1726882383.31327: done dumping result, returning 11124 1726882383.31334: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0e448fcc-3ce9-8362-0f62-000000000269] 11124 1726882383.31336: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000269 11124 1726882383.31418: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000269 11124 1726882383.31421: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882383.31478: no more pending results, returning what we have 11124 1726882383.31480: results queue empty 11124 1726882383.31481: checking for any_errors_fatal 11124 1726882383.31488: done checking for any_errors_fatal 11124 1726882383.31488: checking for max_fail_percentage 11124 1726882383.31490: done checking for max_fail_percentage 11124 1726882383.31491: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.31492: done checking to see if all hosts have failed 11124 1726882383.31493: getting the remaining hosts for this loop 11124 1726882383.31494: done getting the remaining hosts for this loop 11124 1726882383.31497: getting the next task for host managed_node1 11124 1726882383.31502: done getting 
next task for host managed_node1 11124 1726882383.31505: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11124 1726882383.31508: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.31511: getting variables 11124 1726882383.31513: in VariableManager get_vars() 11124 1726882383.31559: Calling all_inventory to load vars for managed_node1 11124 1726882383.31561: Calling groups_inventory to load vars for managed_node1 11124 1726882383.31564: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.31575: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.31577: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.31579: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.32386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.33322: done with get_vars() 11124 1726882383.33338: done getting variables 11124 1726882383.33387: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882383.33469: variable 'profile' from source: include params 11124 1726882383.33472: variable 'item' from 
source: include params 11124 1726882383.33513: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:03 -0400 (0:00:00.037) 0:00:23.577 ****** 11124 1726882383.33539: entering _queue_task() for managed_node1/assert 11124 1726882383.33770: worker is 1 (out of 1 available) 11124 1726882383.33783: exiting _queue_task() for managed_node1/assert 11124 1726882383.33797: done queuing things up, now waiting for results queue to drain 11124 1726882383.33798: waiting for pending results... 11124 1726882383.33969: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 11124 1726882383.34030: in run() - task 0e448fcc-3ce9-8362-0f62-00000000026a 11124 1726882383.34042: variable 'ansible_search_path' from source: unknown 11124 1726882383.34047: variable 'ansible_search_path' from source: unknown 11124 1726882383.34077: calling self._execute() 11124 1726882383.34148: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.34154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.34165: variable 'omit' from source: magic vars 11124 1726882383.34414: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.34423: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.34429: variable 'omit' from source: magic vars 11124 1726882383.34455: variable 'omit' from source: magic vars 11124 1726882383.34525: variable 'profile' from source: include params 11124 1726882383.34529: variable 'item' from source: include params 11124 1726882383.34576: variable 'item' from source: include params 11124 1726882383.34592: variable 'omit' from source: magic vars 11124 1726882383.34623: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.34647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.34668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882383.34681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.34691: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.34716: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882383.34719: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.34721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.34790: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.34798: Set connection var ansible_shell_type to sh 11124 1726882383.34805: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.34810: Set connection var ansible_timeout to 10 11124 1726882383.34814: Set connection var ansible_pipelining to False 11124 1726882383.34817: Set connection var ansible_connection to ssh 11124 1726882383.34833: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.34835: variable 'ansible_connection' from source: unknown 11124 1726882383.34838: variable 'ansible_module_compression' from source: unknown 11124 1726882383.34840: variable 'ansible_shell_type' from source: unknown 11124 1726882383.34842: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.34844: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.34848: variable 'ansible_pipelining' from source: unknown 11124 1726882383.34854: variable 'ansible_timeout' 
from source: unknown 11124 1726882383.34856: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.34956: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.34963: variable 'omit' from source: magic vars 11124 1726882383.34969: starting attempt loop 11124 1726882383.34971: running the handler 11124 1726882383.35047: variable 'lsr_net_profile_fingerprint' from source: set_fact 11124 1726882383.35053: Evaluated conditional (lsr_net_profile_fingerprint): True 11124 1726882383.35056: handler run complete 11124 1726882383.35071: attempt loop complete, returning result 11124 1726882383.35074: _execute() done 11124 1726882383.35076: dumping result to json 11124 1726882383.35079: done dumping result, returning 11124 1726882383.35085: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 [0e448fcc-3ce9-8362-0f62-00000000026a] 11124 1726882383.35089: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000026a 11124 1726882383.35178: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000026a 11124 1726882383.35181: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882383.35225: no more pending results, returning what we have 11124 1726882383.35232: results queue empty 11124 1726882383.35233: checking for any_errors_fatal 11124 1726882383.35242: done checking for any_errors_fatal 11124 1726882383.35243: checking for max_fail_percentage 11124 1726882383.35244: done checking for max_fail_percentage 11124 1726882383.35245: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.35246: done checking to see if all 
hosts have failed 11124 1726882383.35247: getting the remaining hosts for this loop 11124 1726882383.35248: done getting the remaining hosts for this loop 11124 1726882383.35254: getting the next task for host managed_node1 11124 1726882383.35262: done getting next task for host managed_node1 11124 1726882383.35266: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11124 1726882383.35269: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.35273: getting variables 11124 1726882383.35275: in VariableManager get_vars() 11124 1726882383.35309: Calling all_inventory to load vars for managed_node1 11124 1726882383.35312: Calling groups_inventory to load vars for managed_node1 11124 1726882383.35314: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.35324: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.35326: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.35328: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.36218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.37135: done with get_vars() 11124 1726882383.37152: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:33:03 -0400 (0:00:00.036) 0:00:23.614 ****** 11124 1726882383.37222: entering _queue_task() for managed_node1/include_tasks 11124 1726882383.37448: worker is 1 (out of 1 available) 11124 1726882383.37462: exiting _queue_task() for managed_node1/include_tasks 11124 1726882383.37476: done queuing things up, now waiting for results queue to drain 11124 1726882383.37478: waiting for pending results... 11124 1726882383.37648: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11124 1726882383.37731: in run() - task 0e448fcc-3ce9-8362-0f62-00000000026e 11124 1726882383.37742: variable 'ansible_search_path' from source: unknown 11124 1726882383.37746: variable 'ansible_search_path' from source: unknown 11124 1726882383.37777: calling self._execute() 11124 1726882383.37850: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.37857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.37866: variable 'omit' from source: magic vars 11124 1726882383.38133: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.38143: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.38150: _execute() done 11124 1726882383.38158: dumping result to json 11124 1726882383.38161: done dumping result, returning 11124 1726882383.38165: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0e448fcc-3ce9-8362-0f62-00000000026e] 11124 1726882383.38171: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000026e 11124 1726882383.38254: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000026e 11124 1726882383.38257: WORKER PROCESS EXITING 11124 1726882383.38289: no more pending results, returning what we have 11124 
1726882383.38293: in VariableManager get_vars() 11124 1726882383.38340: Calling all_inventory to load vars for managed_node1 11124 1726882383.38343: Calling groups_inventory to load vars for managed_node1 11124 1726882383.38345: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.38359: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.38362: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.38370: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.39189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.40202: done with get_vars() 11124 1726882383.40218: variable 'ansible_search_path' from source: unknown 11124 1726882383.40219: variable 'ansible_search_path' from source: unknown 11124 1726882383.40247: we have included files to process 11124 1726882383.40248: generating all_blocks data 11124 1726882383.40249: done generating all_blocks data 11124 1726882383.40253: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882383.40254: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882383.40255: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11124 1726882383.40848: done processing included file 11124 1726882383.40850: iterating over new_blocks loaded from include file 11124 1726882383.40852: in VariableManager get_vars() 11124 1726882383.40869: done with get_vars() 11124 1726882383.40871: filtering new block on tags 11124 1726882383.40887: done filtering new block on tags 11124 1726882383.40889: in VariableManager get_vars() 11124 1726882383.40901: done with get_vars() 11124 1726882383.40902: filtering 
new block on tags 11124 1726882383.40915: done filtering new block on tags 11124 1726882383.40916: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11124 1726882383.40920: extending task lists for all hosts with included blocks 11124 1726882383.41027: done extending task lists 11124 1726882383.41028: done processing included files 11124 1726882383.41028: results queue empty 11124 1726882383.41029: checking for any_errors_fatal 11124 1726882383.41030: done checking for any_errors_fatal 11124 1726882383.41031: checking for max_fail_percentage 11124 1726882383.41032: done checking for max_fail_percentage 11124 1726882383.41032: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.41033: done checking to see if all hosts have failed 11124 1726882383.41033: getting the remaining hosts for this loop 11124 1726882383.41034: done getting the remaining hosts for this loop 11124 1726882383.41036: getting the next task for host managed_node1 11124 1726882383.41038: done getting next task for host managed_node1 11124 1726882383.41039: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11124 1726882383.41042: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.41043: getting variables 11124 1726882383.41044: in VariableManager get_vars() 11124 1726882383.41054: Calling all_inventory to load vars for managed_node1 11124 1726882383.41056: Calling groups_inventory to load vars for managed_node1 11124 1726882383.41057: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.41062: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.41072: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.41075: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.41780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.42690: done with get_vars() 11124 1726882383.42709: done getting variables 11124 1726882383.42741: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:33:03 -0400 (0:00:00.055) 0:00:23.670 ****** 11124 1726882383.42767: entering _queue_task() for managed_node1/set_fact 11124 1726882383.43015: worker is 1 (out of 1 available) 11124 1726882383.43029: exiting _queue_task() for managed_node1/set_fact 11124 1726882383.43041: done queuing things up, now waiting for results queue to drain 11124 1726882383.43043: waiting for pending results... 
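The flag names this set_fact task initializes are visible in the JSON result a few entries below. The task file itself (get_profile_stat.yml:3) is not part of this log, so the following is a reconstruction from the logged facts only, not the verbatim source:

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:3 -- the three
# fact names and their initial false values are taken from the
# "ansible_facts" result logged below; everything else is assumed.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Initializing the flags to false before the stat/lookup tasks means the later `assert` tasks (seen earlier in this log evaluating `lsr_net_profile_exists` etc.) fail closed if the profile is never found.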
11124 1726882383.43223: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11124 1726882383.43297: in run() - task 0e448fcc-3ce9-8362-0f62-000000000443 11124 1726882383.43308: variable 'ansible_search_path' from source: unknown 11124 1726882383.43311: variable 'ansible_search_path' from source: unknown 11124 1726882383.43341: calling self._execute() 11124 1726882383.43416: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.43420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.43428: variable 'omit' from source: magic vars 11124 1726882383.43704: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.43714: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.43721: variable 'omit' from source: magic vars 11124 1726882383.43750: variable 'omit' from source: magic vars 11124 1726882383.43779: variable 'omit' from source: magic vars 11124 1726882383.43814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.43841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.43860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882383.43875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.43884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.43910: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882383.43914: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.43916: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11124 1726882383.43986: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.43992: Set connection var ansible_shell_type to sh 11124 1726882383.44000: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.44003: Set connection var ansible_timeout to 10 11124 1726882383.44009: Set connection var ansible_pipelining to False 11124 1726882383.44012: Set connection var ansible_connection to ssh 11124 1726882383.44032: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.44036: variable 'ansible_connection' from source: unknown 11124 1726882383.44039: variable 'ansible_module_compression' from source: unknown 11124 1726882383.44042: variable 'ansible_shell_type' from source: unknown 11124 1726882383.44044: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.44046: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.44049: variable 'ansible_pipelining' from source: unknown 11124 1726882383.44051: variable 'ansible_timeout' from source: unknown 11124 1726882383.44053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.44156: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.44165: variable 'omit' from source: magic vars 11124 1726882383.44173: starting attempt loop 11124 1726882383.44176: running the handler 11124 1726882383.44186: handler run complete 11124 1726882383.44194: attempt loop complete, returning result 11124 1726882383.44197: _execute() done 11124 1726882383.44199: dumping result to json 11124 1726882383.44202: done dumping result, returning 11124 1726882383.44209: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0e448fcc-3ce9-8362-0f62-000000000443] 11124 1726882383.44213: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000443 11124 1726882383.44296: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000443 11124 1726882383.44299: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11124 1726882383.44354: no more pending results, returning what we have 11124 1726882383.44357: results queue empty 11124 1726882383.44358: checking for any_errors_fatal 11124 1726882383.44360: done checking for any_errors_fatal 11124 1726882383.44360: checking for max_fail_percentage 11124 1726882383.44362: done checking for max_fail_percentage 11124 1726882383.44363: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.44365: done checking to see if all hosts have failed 11124 1726882383.44366: getting the remaining hosts for this loop 11124 1726882383.44367: done getting the remaining hosts for this loop 11124 1726882383.44370: getting the next task for host managed_node1 11124 1726882383.44381: done getting next task for host managed_node1 11124 1726882383.44383: ^ task is: TASK: Stat profile file 11124 1726882383.44388: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882383.44392: getting variables 11124 1726882383.44394: in VariableManager get_vars() 11124 1726882383.44432: Calling all_inventory to load vars for managed_node1 11124 1726882383.44435: Calling groups_inventory to load vars for managed_node1 11124 1726882383.44438: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.44452: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.44454: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.44456: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.45334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.46253: done with get_vars() 11124 1726882383.46271: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:33:03 -0400 (0:00:00.035) 0:00:23.705 ****** 11124 1726882383.46338: entering _queue_task() for managed_node1/stat 11124 1726882383.46565: worker is 1 (out of 1 available) 11124 1726882383.46579: exiting _queue_task() for managed_node1/stat 11124 1726882383.46591: done queuing things up, now waiting for results queue to drain 11124 1726882383.46593: waiting for pending results... 
11124 1726882383.46768: running TaskExecutor() for managed_node1/TASK: Stat profile file 11124 1726882383.46836: in run() - task 0e448fcc-3ce9-8362-0f62-000000000444 11124 1726882383.46848: variable 'ansible_search_path' from source: unknown 11124 1726882383.46852: variable 'ansible_search_path' from source: unknown 11124 1726882383.46885: calling self._execute() 11124 1726882383.46960: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.46965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.46974: variable 'omit' from source: magic vars 11124 1726882383.47242: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.47254: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.47262: variable 'omit' from source: magic vars 11124 1726882383.47294: variable 'omit' from source: magic vars 11124 1726882383.47365: variable 'profile' from source: include params 11124 1726882383.47369: variable 'item' from source: include params 11124 1726882383.47415: variable 'item' from source: include params 11124 1726882383.47430: variable 'omit' from source: magic vars 11124 1726882383.47466: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.47494: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.47510: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882383.47522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.47535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.47558: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882383.47561: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.47565: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.47633: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.47643: Set connection var ansible_shell_type to sh 11124 1726882383.47648: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.47655: Set connection var ansible_timeout to 10 11124 1726882383.47660: Set connection var ansible_pipelining to False 11124 1726882383.47665: Set connection var ansible_connection to ssh 11124 1726882383.47681: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.47685: variable 'ansible_connection' from source: unknown 11124 1726882383.47687: variable 'ansible_module_compression' from source: unknown 11124 1726882383.47689: variable 'ansible_shell_type' from source: unknown 11124 1726882383.47692: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.47695: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.47697: variable 'ansible_pipelining' from source: unknown 11124 1726882383.47700: variable 'ansible_timeout' from source: unknown 11124 1726882383.47702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.47849: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882383.47861: variable 'omit' from source: magic vars 11124 1726882383.47864: starting attempt loop 11124 1726882383.47869: running the handler 11124 1726882383.47882: _low_level_execute_command(): starting 11124 1726882383.47888: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882383.48417: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.48447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.48462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.48518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882383.48524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.48536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.48647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.50306: stdout chunk (state=3): >>>/root <<< 11124 1726882383.50411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.50465: stderr chunk (state=3): >>><<< 11124 1726882383.50469: stdout chunk (state=3): >>><<< 11124 1726882383.50488: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882383.50499: _low_level_execute_command(): starting 11124 1726882383.50506: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030 `" && echo ansible-tmp-1726882383.50488-12255-31522467970030="` echo /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030 `" ) && sleep 0' 11124 1726882383.50945: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.50962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.50989: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882383.51009: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.51063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882383.51078: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.51174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.53054: stdout chunk (state=3): >>>ansible-tmp-1726882383.50488-12255-31522467970030=/root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030 <<< 11124 1726882383.53168: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.53218: stderr chunk (state=3): >>><<< 11124 1726882383.53221: stdout chunk (state=3): >>><<< 11124 1726882383.53240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882383.50488-12255-31522467970030=/root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882383.53281: variable 'ansible_module_compression' from source: unknown 11124 1726882383.53332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11124 1726882383.53360: variable 'ansible_facts' from source: unknown 11124 1726882383.53421: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/AnsiballZ_stat.py 11124 1726882383.53534: Sending initial data 11124 1726882383.53538: Sent initial data (150 bytes) 11124 1726882383.54213: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.54235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.54248: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.54302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.54325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.54417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.56172: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882383.56269: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882383.56364: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmppsv5uok6 /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/AnsiballZ_stat.py <<< 11124 1726882383.56455: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882383.57441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.57541: stderr chunk (state=3): >>><<< 11124 1726882383.57544: stdout chunk (state=3): >>><<< 11124 1726882383.57565: done transferring module to remote 11124 
1726882383.57575: _low_level_execute_command(): starting 11124 1726882383.57580: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/ /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/AnsiballZ_stat.py && sleep 0' 11124 1726882383.58019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.58025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.58055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.58073: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.58085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.58132: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.58137: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.58247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.60024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.60076: stderr chunk (state=3): >>><<< 
11124 1726882383.60084: stdout chunk (state=3): >>><<< 11124 1726882383.60095: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882383.60097: _low_level_execute_command(): starting 11124 1726882383.60103: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/AnsiballZ_stat.py && sleep 0' 11124 1726882383.60555: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.60574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.60587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
not found <<< 11124 1726882383.60599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.60610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.60663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.60670: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.60789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.74599: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11124 1726882383.75701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882383.75705: stdout chunk (state=3): >>><<< 11124 1726882383.75710: stderr chunk (state=3): >>><<< 11124 1726882383.75735: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882383.75771: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882383.75781: _low_level_execute_command(): starting 11124 1726882383.75786: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882383.50488-12255-31522467970030/ > /dev/null 2>&1 && sleep 0' 11124 1726882383.76486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882383.76494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.76508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.76520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.76606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882383.76613: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882383.76623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.76647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 
1726882383.76655: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882383.76665: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882383.76674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.76683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.76695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.76702: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882383.76708: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882383.76718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.76798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882383.76817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.76830: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.76955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.78879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.78885: stdout chunk (state=3): >>><<< 11124 1726882383.78887: stderr chunk (state=3): >>><<< 11124 1726882383.78896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882383.78903: handler run complete 11124 1726882383.78929: attempt loop complete, returning result 11124 1726882383.78932: _execute() done 11124 1726882383.78934: dumping result to json 11124 1726882383.78936: done dumping result, returning 11124 1726882383.78945: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0e448fcc-3ce9-8362-0f62-000000000444] 11124 1726882383.78952: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000444 11124 1726882383.79060: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000444 11124 1726882383.79063: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11124 1726882383.79134: no more pending results, returning what we have 11124 1726882383.79138: results queue empty 11124 1726882383.79139: checking for any_errors_fatal 11124 1726882383.79147: done checking for any_errors_fatal 11124 1726882383.79148: checking for max_fail_percentage 11124 1726882383.79150: done checking for max_fail_percentage 11124 1726882383.79151: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.79152: done checking to see if all hosts have failed 11124 1726882383.79153: getting the remaining hosts for this loop 11124 
1726882383.79154: done getting the remaining hosts for this loop 11124 1726882383.79158: getting the next task for host managed_node1 11124 1726882383.79170: done getting next task for host managed_node1 11124 1726882383.79174: ^ task is: TASK: Set NM profile exist flag based on the profile files 11124 1726882383.79179: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882383.79184: getting variables 11124 1726882383.79187: in VariableManager get_vars() 11124 1726882383.79233: Calling all_inventory to load vars for managed_node1 11124 1726882383.79236: Calling groups_inventory to load vars for managed_node1 11124 1726882383.79239: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.79252: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.79255: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.79259: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.80966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.82801: done with get_vars() 11124 1726882383.82824: done getting variables 11124 1726882383.82897: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:33:03 -0400 (0:00:00.365) 0:00:24.071 ****** 11124 1726882383.82929: entering _queue_task() for managed_node1/set_fact 11124 1726882383.83269: worker is 1 (out of 1 available) 11124 1726882383.83286: exiting _queue_task() for managed_node1/set_fact 11124 1726882383.83299: done queuing things up, now waiting for results queue to drain 11124 1726882383.83300: waiting for pending results... 
11124 1726882383.83591: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11124 1726882383.83695: in run() - task 0e448fcc-3ce9-8362-0f62-000000000445 11124 1726882383.83708: variable 'ansible_search_path' from source: unknown 11124 1726882383.83712: variable 'ansible_search_path' from source: unknown 11124 1726882383.83755: calling self._execute() 11124 1726882383.83856: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.83861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.83873: variable 'omit' from source: magic vars 11124 1726882383.84263: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.84284: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.84413: variable 'profile_stat' from source: set_fact 11124 1726882383.84427: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882383.84430: when evaluation is False, skipping this task 11124 1726882383.84433: _execute() done 11124 1726882383.84435: dumping result to json 11124 1726882383.84438: done dumping result, returning 11124 1726882383.84445: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0e448fcc-3ce9-8362-0f62-000000000445] 11124 1726882383.84453: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000445 11124 1726882383.84537: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000445 11124 1726882383.84541: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882383.84598: no more pending results, returning what we have 11124 1726882383.84603: results queue empty 11124 1726882383.84604: checking for any_errors_fatal 11124 1726882383.84612: done checking for any_errors_fatal 11124 1726882383.84613: 
checking for max_fail_percentage 11124 1726882383.84615: done checking for max_fail_percentage 11124 1726882383.84616: checking to see if all hosts have failed and the running result is not ok 11124 1726882383.84617: done checking to see if all hosts have failed 11124 1726882383.84618: getting the remaining hosts for this loop 11124 1726882383.84620: done getting the remaining hosts for this loop 11124 1726882383.84624: getting the next task for host managed_node1 11124 1726882383.84631: done getting next task for host managed_node1 11124 1726882383.84634: ^ task is: TASK: Get NM profile info 11124 1726882383.84639: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882383.84644: getting variables 11124 1726882383.84646: in VariableManager get_vars() 11124 1726882383.84697: Calling all_inventory to load vars for managed_node1 11124 1726882383.84700: Calling groups_inventory to load vars for managed_node1 11124 1726882383.84702: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882383.84719: Calling all_plugins_play to load vars for managed_node1 11124 1726882383.84722: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882383.84726: Calling groups_plugins_play to load vars for managed_node1 11124 1726882383.91251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882383.92933: done with get_vars() 11124 1726882383.92956: done getting variables 11124 1726882383.93005: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:33:03 -0400 (0:00:00.101) 0:00:24.172 ****** 11124 1726882383.93038: entering _queue_task() for managed_node1/shell 11124 1726882383.93377: worker is 1 (out of 1 available) 11124 1726882383.93389: exiting _queue_task() for managed_node1/shell 11124 1726882383.93401: done queuing things up, now waiting for results queue to drain 11124 1726882383.93403: waiting for pending results... 
11124 1726882383.93700: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11124 1726882383.93806: in run() - task 0e448fcc-3ce9-8362-0f62-000000000446 11124 1726882383.93820: variable 'ansible_search_path' from source: unknown 11124 1726882383.93825: variable 'ansible_search_path' from source: unknown 11124 1726882383.93868: calling self._execute() 11124 1726882383.93968: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.93974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.93983: variable 'omit' from source: magic vars 11124 1726882383.94371: variable 'ansible_distribution_major_version' from source: facts 11124 1726882383.94382: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882383.94398: variable 'omit' from source: magic vars 11124 1726882383.94447: variable 'omit' from source: magic vars 11124 1726882383.94551: variable 'profile' from source: include params 11124 1726882383.94556: variable 'item' from source: include params 11124 1726882383.94621: variable 'item' from source: include params 11124 1726882383.94635: variable 'omit' from source: magic vars 11124 1726882383.94683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882383.94716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882383.94742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882383.94762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.94778: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882383.94804: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882383.94808: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.94810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.94920: Set connection var ansible_shell_executable to /bin/sh 11124 1726882383.94927: Set connection var ansible_shell_type to sh 11124 1726882383.94935: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882383.94946: Set connection var ansible_timeout to 10 11124 1726882383.94957: Set connection var ansible_pipelining to False 11124 1726882383.94960: Set connection var ansible_connection to ssh 11124 1726882383.94980: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.94983: variable 'ansible_connection' from source: unknown 11124 1726882383.94985: variable 'ansible_module_compression' from source: unknown 11124 1726882383.94987: variable 'ansible_shell_type' from source: unknown 11124 1726882383.94990: variable 'ansible_shell_executable' from source: unknown 11124 1726882383.94992: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882383.94997: variable 'ansible_pipelining' from source: unknown 11124 1726882383.95001: variable 'ansible_timeout' from source: unknown 11124 1726882383.95004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882383.95151: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.95165: variable 'omit' from source: magic vars 11124 1726882383.95170: starting attempt loop 11124 1726882383.95172: running the handler 11124 1726882383.95182: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882383.95205: _low_level_execute_command(): starting 11124 1726882383.95213: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882383.95999: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882383.96013: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.96022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.96045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.96083: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882383.96094: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882383.96110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.96116: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882383.96127: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882383.96134: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882383.96147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.96158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.96171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.96180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882383.96185: stderr chunk (state=3): >>>debug2: match found <<< 11124 
1726882383.96199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.96275: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882383.96294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.96308: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.96440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882383.98153: stdout chunk (state=3): >>>/root <<< 11124 1726882383.98294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882383.98297: stdout chunk (state=3): >>><<< 11124 1726882383.98304: stderr chunk (state=3): >>><<< 11124 1726882383.98328: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 11124 1726882383.98339: _low_level_execute_command(): starting 11124 1726882383.98347: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830 `" && echo ansible-tmp-1726882383.983257-12276-48963784936830="` echo /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830 `" ) && sleep 0' 11124 1726882383.99074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882383.99083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.99094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.99108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.99146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882383.99155: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882383.99165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.99182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882383.99189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882383.99196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882383.99204: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882383.99213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882383.99225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882383.99233: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 11124 1726882383.99238: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882383.99248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882383.99321: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882383.99338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882383.99352: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882383.99475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882384.01377: stdout chunk (state=3): >>>ansible-tmp-1726882383.983257-12276-48963784936830=/root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830 <<< 11124 1726882384.01546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882384.01552: stdout chunk (state=3): >>><<< 11124 1726882384.01557: stderr chunk (state=3): >>><<< 11124 1726882384.01577: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882383.983257-12276-48963784936830=/root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882384.01607: variable 'ansible_module_compression' from source: unknown 11124 1726882384.01663: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882384.01700: variable 'ansible_facts' from source: unknown 11124 1726882384.01774: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/AnsiballZ_command.py 11124 1726882384.01910: Sending initial data 11124 1726882384.01913: Sent initial data (154 bytes) 11124 1726882384.02830: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882384.02836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.02853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.02859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.02902: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.02910: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882384.02918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.02930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882384.02938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882384.02945: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 11124 1726882384.02954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.02962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.02983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.02991: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.02997: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882384.03006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.03077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882384.03095: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882384.03102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882384.03239: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882384.04964: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882384.05056: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882384.05147: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp6_uzfj_a /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/AnsiballZ_command.py <<< 11124 1726882384.05236: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882384.06238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882384.06347: stderr chunk (state=3): >>><<< 11124 1726882384.06353: stdout chunk (state=3): >>><<< 11124 1726882384.06371: done transferring module to remote 11124 1726882384.06385: _low_level_execute_command(): starting 11124 1726882384.06389: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/ /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/AnsiballZ_command.py && sleep 0' 11124 1726882384.07009: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882384.07020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.07029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.07048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.07093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.07100: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882384.07109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.07123: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882384.07131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882384.07137: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882384.07145: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.07162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.07176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.07184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.07190: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882384.07199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.07279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882384.07296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882384.07307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882384.07429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882384.09262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882384.09310: stderr chunk (state=3): >>><<< 11124 1726882384.09314: stdout chunk (state=3): >>><<< 11124 1726882384.09328: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882384.09331: _low_level_execute_command(): starting 11124 1726882384.09336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/AnsiballZ_command.py && sleep 0' 11124 1726882384.10070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882384.10074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.10076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.10078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.10080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.10082: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882384.10083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.10085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882384.10087: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882384.10089: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882384.10091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.10093: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.10095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.10097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.10099: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882384.10101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.10137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882384.10150: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882384.10163: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882384.10297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882384.25751: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:04.233214", "end": "2024-09-20 21:33:04.256125", "delta": "0:00:00.022911", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882384.27034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882384.27038: stdout chunk (state=3): >>><<< 11124 1726882384.27040: stderr chunk (state=3): >>><<< 11124 1726882384.27061: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:33:04.233214", "end": "2024-09-20 21:33:04.256125", "delta": "0:00:00.022911", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882384.27101: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882384.27106: _low_level_execute_command(): starting 11124 1726882384.27111: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882383.983257-12276-48963784936830/ > /dev/null 2>&1 && sleep 0' 11124 1726882384.27570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.27598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.27623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.27628: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882384.27637: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.27686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882384.27694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.27769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882384.27885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882384.27922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882384.28121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882384.29972: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882384.29978: stdout chunk (state=3): >>><<< 11124 1726882384.29983: stderr chunk (state=3): >>><<< 11124 1726882384.29998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882384.30004: handler run complete 11124 1726882384.30022: Evaluated conditional (False): False 11124 1726882384.30031: attempt loop complete, returning result 11124 1726882384.30033: _execute() done 11124 1726882384.30036: dumping result to json 11124 1726882384.30041: done dumping result, returning 11124 1726882384.30047: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0e448fcc-3ce9-8362-0f62-000000000446] 11124 1726882384.30054: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000446 11124 1726882384.30150: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000446 11124 1726882384.30153: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022911", "end": "2024-09-20 21:33:04.256125", "rc": 0, "start": "2024-09-20 21:33:04.233214" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11124 1726882384.30251: no more pending results, returning what we have 11124 1726882384.30253: results queue empty 11124 1726882384.30255: checking for any_errors_fatal 11124 1726882384.30260: done checking for any_errors_fatal 11124 1726882384.30260: checking for max_fail_percentage 11124 1726882384.30264: done checking for max_fail_percentage 11124 1726882384.30264: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.30265: done checking to see if all hosts have failed 11124 1726882384.30266: getting the remaining hosts for this loop 11124 1726882384.30268: done getting the remaining hosts for this loop 11124 1726882384.30271: getting the next task for host managed_node1 11124 
1726882384.30278: done getting next task for host managed_node1 11124 1726882384.30280: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11124 1726882384.30283: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882384.30287: getting variables 11124 1726882384.30288: in VariableManager get_vars() 11124 1726882384.30325: Calling all_inventory to load vars for managed_node1 11124 1726882384.30327: Calling groups_inventory to load vars for managed_node1 11124 1726882384.30329: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.30340: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.30342: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.30344: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.31429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.33131: done with get_vars() 11124 1726882384.33159: done getting variables 11124 1726882384.33223: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:33:04 -0400 (0:00:00.402) 0:00:24.575 ****** 11124 1726882384.33259: entering _queue_task() for managed_node1/set_fact 11124 1726882384.33589: worker is 1 (out of 1 available) 11124 1726882384.33603: exiting _queue_task() for managed_node1/set_fact 11124 1726882384.33614: done queuing things up, now waiting for results queue to drain 11124 1726882384.33616: waiting for pending results... 
11124 1726882384.33893: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11124 1726882384.34018: in run() - task 0e448fcc-3ce9-8362-0f62-000000000447 11124 1726882384.34045: variable 'ansible_search_path' from source: unknown 11124 1726882384.34054: variable 'ansible_search_path' from source: unknown 11124 1726882384.34102: calling self._execute() 11124 1726882384.34216: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.34229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.34242: variable 'omit' from source: magic vars 11124 1726882384.34626: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.34644: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.34781: variable 'nm_profile_exists' from source: set_fact 11124 1726882384.34801: Evaluated conditional (nm_profile_exists.rc == 0): True 11124 1726882384.34811: variable 'omit' from source: magic vars 11124 1726882384.34867: variable 'omit' from source: magic vars 11124 1726882384.34903: variable 'omit' from source: magic vars 11124 1726882384.34950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882384.34992: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882384.35016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882384.35042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.35058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.35096: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11124 1726882384.35110: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.35120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.35227: Set connection var ansible_shell_executable to /bin/sh 11124 1726882384.35240: Set connection var ansible_shell_type to sh 11124 1726882384.35256: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882384.35268: Set connection var ansible_timeout to 10 11124 1726882384.35278: Set connection var ansible_pipelining to False 11124 1726882384.35284: Set connection var ansible_connection to ssh 11124 1726882384.35310: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.35317: variable 'ansible_connection' from source: unknown 11124 1726882384.35324: variable 'ansible_module_compression' from source: unknown 11124 1726882384.35331: variable 'ansible_shell_type' from source: unknown 11124 1726882384.35337: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.35343: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.35350: variable 'ansible_pipelining' from source: unknown 11124 1726882384.35357: variable 'ansible_timeout' from source: unknown 11124 1726882384.35370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.35511: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882384.35528: variable 'omit' from source: magic vars 11124 1726882384.35537: starting attempt loop 11124 1726882384.35543: running the handler 11124 1726882384.35559: handler run complete 11124 1726882384.35577: attempt loop complete, returning result 11124 1726882384.35585: _execute() done 
11124 1726882384.35591: dumping result to json 11124 1726882384.35597: done dumping result, returning 11124 1726882384.35608: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0e448fcc-3ce9-8362-0f62-000000000447] 11124 1726882384.35616: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000447 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11124 1726882384.35764: no more pending results, returning what we have 11124 1726882384.35768: results queue empty 11124 1726882384.35769: checking for any_errors_fatal 11124 1726882384.35776: done checking for any_errors_fatal 11124 1726882384.35777: checking for max_fail_percentage 11124 1726882384.35779: done checking for max_fail_percentage 11124 1726882384.35780: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.35781: done checking to see if all hosts have failed 11124 1726882384.35781: getting the remaining hosts for this loop 11124 1726882384.35783: done getting the remaining hosts for this loop 11124 1726882384.35786: getting the next task for host managed_node1 11124 1726882384.35795: done getting next task for host managed_node1 11124 1726882384.35798: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11124 1726882384.35802: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882384.35806: getting variables 11124 1726882384.35808: in VariableManager get_vars() 11124 1726882384.35852: Calling all_inventory to load vars for managed_node1 11124 1726882384.35855: Calling groups_inventory to load vars for managed_node1 11124 1726882384.35857: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.35871: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.35874: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.35878: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.36885: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000447 11124 1726882384.36890: WORKER PROCESS EXITING 11124 1726882384.37596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.39290: done with get_vars() 11124 1726882384.39316: done getting variables 11124 1726882384.39378: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882384.39494: variable 'profile' from source: include params 11124 1726882384.39498: variable 'item' from source: include params 11124 1726882384.39556: variable 'item' from source: include params TASK [Get the 
ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:33:04 -0400 (0:00:00.063) 0:00:24.638 ****** 11124 1726882384.39595: entering _queue_task() for managed_node1/command 11124 1726882384.39910: worker is 1 (out of 1 available) 11124 1726882384.39923: exiting _queue_task() for managed_node1/command 11124 1726882384.39935: done queuing things up, now waiting for results queue to drain 11124 1726882384.39937: waiting for pending results... 11124 1726882384.40208: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11124 1726882384.40338: in run() - task 0e448fcc-3ce9-8362-0f62-000000000449 11124 1726882384.40359: variable 'ansible_search_path' from source: unknown 11124 1726882384.40373: variable 'ansible_search_path' from source: unknown 11124 1726882384.40418: calling self._execute() 11124 1726882384.40522: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.40534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.40549: variable 'omit' from source: magic vars 11124 1726882384.40911: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.40932: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.41057: variable 'profile_stat' from source: set_fact 11124 1726882384.41079: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882384.41086: when evaluation is False, skipping this task 11124 1726882384.41093: _execute() done 11124 1726882384.41101: dumping result to json 11124 1726882384.41108: done dumping result, returning 11124 1726882384.41117: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-8362-0f62-000000000449] 11124 
1726882384.41127: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000449 11124 1726882384.41237: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000449 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882384.41298: no more pending results, returning what we have 11124 1726882384.41303: results queue empty 11124 1726882384.41304: checking for any_errors_fatal 11124 1726882384.41312: done checking for any_errors_fatal 11124 1726882384.41313: checking for max_fail_percentage 11124 1726882384.41315: done checking for max_fail_percentage 11124 1726882384.41316: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.41317: done checking to see if all hosts have failed 11124 1726882384.41318: getting the remaining hosts for this loop 11124 1726882384.41319: done getting the remaining hosts for this loop 11124 1726882384.41322: getting the next task for host managed_node1 11124 1726882384.41330: done getting next task for host managed_node1 11124 1726882384.41333: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11124 1726882384.41338: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882384.41343: getting variables 11124 1726882384.41345: in VariableManager get_vars() 11124 1726882384.41391: Calling all_inventory to load vars for managed_node1 11124 1726882384.41395: Calling groups_inventory to load vars for managed_node1 11124 1726882384.41397: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.41411: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.41414: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.41417: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.42381: WORKER PROCESS EXITING 11124 1726882384.43281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.46408: done with get_vars() 11124 1726882384.46439: done getting variables 11124 1726882384.46540: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882384.46669: variable 'profile' from source: include params 11124 1726882384.46672: variable 'item' from source: include params 11124 1726882384.46733: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:33:04 -0400 (0:00:00.071) 0:00:24.710 ****** 11124 1726882384.46767: entering _queue_task() for managed_node1/set_fact 11124 1726882384.47119: worker is 1 (out of 1 available) 11124 1726882384.47131: exiting _queue_task() for managed_node1/set_fact 
11124 1726882384.47142: done queuing things up, now waiting for results queue to drain 11124 1726882384.47144: waiting for pending results... 11124 1726882384.47433: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11124 1726882384.47567: in run() - task 0e448fcc-3ce9-8362-0f62-00000000044a 11124 1726882384.47596: variable 'ansible_search_path' from source: unknown 11124 1726882384.47604: variable 'ansible_search_path' from source: unknown 11124 1726882384.47647: calling self._execute() 11124 1726882384.47756: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.47770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.47784: variable 'omit' from source: magic vars 11124 1726882384.48165: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.48184: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.49870: variable 'profile_stat' from source: set_fact 11124 1726882384.49924: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882384.50021: when evaluation is False, skipping this task 11124 1726882384.50030: _execute() done 11124 1726882384.50037: dumping result to json 11124 1726882384.50044: done dumping result, returning 11124 1726882384.50054: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0e448fcc-3ce9-8362-0f62-00000000044a] 11124 1726882384.50066: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000044a skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882384.50217: no more pending results, returning what we have 11124 1726882384.50221: results queue empty 11124 1726882384.50223: checking for any_errors_fatal 11124 1726882384.50229: done checking for any_errors_fatal 11124 1726882384.50229: 
checking for max_fail_percentage 11124 1726882384.50231: done checking for max_fail_percentage 11124 1726882384.50232: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.50234: done checking to see if all hosts have failed 11124 1726882384.50234: getting the remaining hosts for this loop 11124 1726882384.50236: done getting the remaining hosts for this loop 11124 1726882384.50240: getting the next task for host managed_node1 11124 1726882384.50247: done getting next task for host managed_node1 11124 1726882384.50250: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11124 1726882384.50255: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882384.50261: getting variables 11124 1726882384.50266: in VariableManager get_vars() 11124 1726882384.50313: Calling all_inventory to load vars for managed_node1 11124 1726882384.50317: Calling groups_inventory to load vars for managed_node1 11124 1726882384.50319: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.50334: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.50337: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.50341: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.51585: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000044a 11124 1726882384.51589: WORKER PROCESS EXITING 11124 1726882384.52261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.54272: done with get_vars() 11124 1726882384.54306: done getting variables 11124 1726882384.54407: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882384.54520: variable 'profile' from source: include params 11124 1726882384.54524: variable 'item' from source: include params 11124 1726882384.54588: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:33:04 -0400 (0:00:00.078) 0:00:24.789 ****** 11124 1726882384.54649: entering _queue_task() for managed_node1/command 11124 1726882384.55021: worker is 1 (out of 1 available) 11124 1726882384.55033: exiting _queue_task() for managed_node1/command 11124 
1726882384.55045: done queuing things up, now waiting for results queue to drain 11124 1726882384.55047: waiting for pending results... 11124 1726882384.55336: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 11124 1726882384.55466: in run() - task 0e448fcc-3ce9-8362-0f62-00000000044b 11124 1726882384.55491: variable 'ansible_search_path' from source: unknown 11124 1726882384.55499: variable 'ansible_search_path' from source: unknown 11124 1726882384.56216: calling self._execute() 11124 1726882384.56435: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.56447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.56461: variable 'omit' from source: magic vars 11124 1726882384.57007: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.57026: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.57157: variable 'profile_stat' from source: set_fact 11124 1726882384.57184: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882384.57191: when evaluation is False, skipping this task 11124 1726882384.57198: _execute() done 11124 1726882384.57205: dumping result to json 11124 1726882384.57212: done dumping result, returning 11124 1726882384.57222: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-8362-0f62-00000000044b] 11124 1726882384.57231: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000044b skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882384.57390: no more pending results, returning what we have 11124 1726882384.57395: results queue empty 11124 1726882384.57396: checking for any_errors_fatal 11124 1726882384.57403: done checking for any_errors_fatal 11124 1726882384.57404: checking for 
max_fail_percentage 11124 1726882384.57407: done checking for max_fail_percentage 11124 1726882384.57408: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.57409: done checking to see if all hosts have failed 11124 1726882384.57409: getting the remaining hosts for this loop 11124 1726882384.57411: done getting the remaining hosts for this loop 11124 1726882384.57415: getting the next task for host managed_node1 11124 1726882384.57423: done getting next task for host managed_node1 11124 1726882384.57425: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11124 1726882384.57430: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882384.57435: getting variables 11124 1726882384.57437: in VariableManager get_vars() 11124 1726882384.57487: Calling all_inventory to load vars for managed_node1 11124 1726882384.57491: Calling groups_inventory to load vars for managed_node1 11124 1726882384.57494: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.57509: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.57513: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.57516: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.58506: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000044b 11124 1726882384.58510: WORKER PROCESS EXITING 11124 1726882384.59320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.62009: done with get_vars() 11124 1726882384.62043: done getting variables 11124 1726882384.62109: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882384.62226: variable 'profile' from source: include params 11124 1726882384.62230: variable 'item' from source: include params 11124 1726882384.62296: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:33:04 -0400 (0:00:00.076) 0:00:24.865 ****** 11124 1726882384.62329: entering _queue_task() for managed_node1/set_fact 11124 1726882384.62678: worker is 1 (out of 1 available) 11124 1726882384.62691: exiting _queue_task() for managed_node1/set_fact 11124 
1726882384.62707: done queuing things up, now waiting for results queue to drain 11124 1726882384.62709: waiting for pending results... 11124 1726882384.63011: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11124 1726882384.63143: in run() - task 0e448fcc-3ce9-8362-0f62-00000000044c 11124 1726882384.63170: variable 'ansible_search_path' from source: unknown 11124 1726882384.63179: variable 'ansible_search_path' from source: unknown 11124 1726882384.63222: calling self._execute() 11124 1726882384.63330: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.63341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.63356: variable 'omit' from source: magic vars 11124 1726882384.63736: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.63755: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.63889: variable 'profile_stat' from source: set_fact 11124 1726882384.63907: Evaluated conditional (profile_stat.stat.exists): False 11124 1726882384.63922: when evaluation is False, skipping this task 11124 1726882384.63930: _execute() done 11124 1726882384.63938: dumping result to json 11124 1726882384.63946: done dumping result, returning 11124 1726882384.63957: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0e448fcc-3ce9-8362-0f62-00000000044c] 11124 1726882384.63970: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000044c skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11124 1726882384.64117: no more pending results, returning what we have 11124 1726882384.64121: results queue empty 11124 1726882384.64122: checking for any_errors_fatal 11124 1726882384.64128: done checking for any_errors_fatal 11124 1726882384.64129: checking for 
max_fail_percentage 11124 1726882384.64131: done checking for max_fail_percentage 11124 1726882384.64132: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.64134: done checking to see if all hosts have failed 11124 1726882384.64134: getting the remaining hosts for this loop 11124 1726882384.64136: done getting the remaining hosts for this loop 11124 1726882384.64140: getting the next task for host managed_node1 11124 1726882384.64148: done getting next task for host managed_node1 11124 1726882384.64151: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11124 1726882384.64155: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882384.64161: getting variables 11124 1726882384.64163: in VariableManager get_vars() 11124 1726882384.64210: Calling all_inventory to load vars for managed_node1 11124 1726882384.64213: Calling groups_inventory to load vars for managed_node1 11124 1726882384.64215: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.64230: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.64233: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.64236: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.65454: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000044c 11124 1726882384.65458: WORKER PROCESS EXITING 11124 1726882384.66125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.69898: done with get_vars() 11124 1726882384.69931: done getting variables 11124 1726882384.69992: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882384.70108: variable 'profile' from source: include params 11124 1726882384.70111: variable 'item' from source: include params 11124 1726882384.70169: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:33:04 -0400 (0:00:00.078) 0:00:24.944 ****** 11124 1726882384.70199: entering _queue_task() for managed_node1/assert 11124 1726882384.71225: worker is 1 (out of 1 available) 11124 1726882384.71238: exiting _queue_task() for managed_node1/assert 11124 
1726882384.71249: done queuing things up, now waiting for results queue to drain 11124 1726882384.71251: waiting for pending results... 11124 1726882384.72198: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' 11124 1726882384.72406: in run() - task 0e448fcc-3ce9-8362-0f62-00000000026f 11124 1726882384.72419: variable 'ansible_search_path' from source: unknown 11124 1726882384.72422: variable 'ansible_search_path' from source: unknown 11124 1726882384.72460: calling self._execute() 11124 1726882384.72735: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.72741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.72752: variable 'omit' from source: magic vars 11124 1726882384.73205: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.73226: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.73233: variable 'omit' from source: magic vars 11124 1726882384.73313: variable 'omit' from source: magic vars 11124 1726882384.73419: variable 'profile' from source: include params 11124 1726882384.73423: variable 'item' from source: include params 11124 1726882384.73498: variable 'item' from source: include params 11124 1726882384.73517: variable 'omit' from source: magic vars 11124 1726882384.73570: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882384.73606: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882384.73629: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882384.73649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.73669: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.73699: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882384.73703: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.73706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.73815: Set connection var ansible_shell_executable to /bin/sh 11124 1726882384.73822: Set connection var ansible_shell_type to sh 11124 1726882384.73830: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882384.73836: Set connection var ansible_timeout to 10 11124 1726882384.73841: Set connection var ansible_pipelining to False 11124 1726882384.73844: Set connection var ansible_connection to ssh 11124 1726882384.73881: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.73884: variable 'ansible_connection' from source: unknown 11124 1726882384.73886: variable 'ansible_module_compression' from source: unknown 11124 1726882384.73889: variable 'ansible_shell_type' from source: unknown 11124 1726882384.73891: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.73893: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.73898: variable 'ansible_pipelining' from source: unknown 11124 1726882384.73900: variable 'ansible_timeout' from source: unknown 11124 1726882384.73903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.74045: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882384.74060: variable 'omit' from source: magic vars 11124 1726882384.74067: starting 
attempt loop 11124 1726882384.74070: running the handler 11124 1726882384.74186: variable 'lsr_net_profile_exists' from source: set_fact 11124 1726882384.74196: Evaluated conditional (lsr_net_profile_exists): True 11124 1726882384.74205: handler run complete 11124 1726882384.74222: attempt loop complete, returning result 11124 1726882384.74225: _execute() done 11124 1726882384.74227: dumping result to json 11124 1726882384.74230: done dumping result, returning 11124 1726882384.74235: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' [0e448fcc-3ce9-8362-0f62-00000000026f] 11124 1726882384.74241: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000026f 11124 1726882384.74332: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000026f 11124 1726882384.74334: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882384.74381: no more pending results, returning what we have 11124 1726882384.74384: results queue empty 11124 1726882384.74385: checking for any_errors_fatal 11124 1726882384.74391: done checking for any_errors_fatal 11124 1726882384.74392: checking for max_fail_percentage 11124 1726882384.74393: done checking for max_fail_percentage 11124 1726882384.74394: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.74395: done checking to see if all hosts have failed 11124 1726882384.74396: getting the remaining hosts for this loop 11124 1726882384.74397: done getting the remaining hosts for this loop 11124 1726882384.74400: getting the next task for host managed_node1 11124 1726882384.74406: done getting next task for host managed_node1 11124 1726882384.74409: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11124 1726882384.74412: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882384.74417: getting variables 11124 1726882384.74419: in VariableManager get_vars() 11124 1726882384.74461: Calling all_inventory to load vars for managed_node1 11124 1726882384.74466: Calling groups_inventory to load vars for managed_node1 11124 1726882384.74469: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.74479: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.74482: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.74485: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.76432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.78508: done with get_vars() 11124 1726882384.78531: done getting variables 11124 1726882384.78595: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882384.78711: variable 'profile' from source: include params 11124 1726882384.78715: variable 'item' from source: include params 11124 1726882384.78773: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:33:04 -0400 (0:00:00.086) 0:00:25.030 ****** 11124 1726882384.78809: entering _queue_task() for managed_node1/assert 11124 1726882384.79138: worker is 1 (out of 1 available) 11124 1726882384.79153: exiting _queue_task() for managed_node1/assert 11124 1726882384.79167: done queuing things up, now waiting for results queue to drain 11124 1726882384.79169: waiting for pending results... 11124 1726882384.79509: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11124 1726882384.79639: in run() - task 0e448fcc-3ce9-8362-0f62-000000000270 11124 1726882384.79661: variable 'ansible_search_path' from source: unknown 11124 1726882384.79674: variable 'ansible_search_path' from source: unknown 11124 1726882384.79722: calling self._execute() 11124 1726882384.79821: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.79831: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.79854: variable 'omit' from source: magic vars 11124 1726882384.80232: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.80248: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.80265: variable 'omit' from source: magic vars 11124 1726882384.80318: variable 'omit' from source: magic vars 11124 1726882384.80511: variable 'profile' from source: include params 11124 1726882384.80519: variable 'item' from source: include params 11124 1726882384.80597: variable 'item' from source: include params 11124 1726882384.80687: variable 'omit' from source: magic vars 11124 1726882384.80844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882384.80887: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882384.80920: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882384.81060: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.81078: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.81116: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882384.81126: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.81133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.81244: Set connection var ansible_shell_executable to /bin/sh 11124 1726882384.81380: Set connection var ansible_shell_type to sh 11124 1726882384.81392: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882384.81405: Set connection var ansible_timeout to 10 11124 1726882384.81415: Set connection var ansible_pipelining to False 11124 1726882384.81421: Set connection var ansible_connection to ssh 11124 1726882384.81454: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.81575: variable 'ansible_connection' from source: unknown 11124 1726882384.81584: variable 'ansible_module_compression' from source: unknown 11124 1726882384.81593: variable 'ansible_shell_type' from source: unknown 11124 1726882384.81599: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.81605: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.81615: variable 'ansible_pipelining' from source: unknown 11124 1726882384.81623: variable 'ansible_timeout' from source: unknown 11124 1726882384.81633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 
1726882384.81860: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882384.81928: variable 'omit' from source: magic vars 11124 1726882384.81939: starting attempt loop 11124 1726882384.81975: running the handler 11124 1726882384.82280: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11124 1726882384.82289: Evaluated conditional (lsr_net_profile_ansible_managed): True 11124 1726882384.82298: handler run complete 11124 1726882384.82322: attempt loop complete, returning result 11124 1726882384.82328: _execute() done 11124 1726882384.82334: dumping result to json 11124 1726882384.82341: done dumping result, returning 11124 1726882384.82362: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0e448fcc-3ce9-8362-0f62-000000000270] 11124 1726882384.82378: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000270 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882384.82555: no more pending results, returning what we have 11124 1726882384.82559: results queue empty 11124 1726882384.82560: checking for any_errors_fatal 11124 1726882384.82576: done checking for any_errors_fatal 11124 1726882384.82577: checking for max_fail_percentage 11124 1726882384.82579: done checking for max_fail_percentage 11124 1726882384.82580: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.82582: done checking to see if all hosts have failed 11124 1726882384.82583: getting the remaining hosts for this loop 11124 1726882384.82584: done getting the remaining hosts for this loop 11124 1726882384.82592: getting the next task for host managed_node1 11124 1726882384.82599: done 
getting next task for host managed_node1 11124 1726882384.82602: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11124 1726882384.82605: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882384.82610: getting variables 11124 1726882384.82611: in VariableManager get_vars() 11124 1726882384.82660: Calling all_inventory to load vars for managed_node1 11124 1726882384.82667: Calling groups_inventory to load vars for managed_node1 11124 1726882384.82670: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.82683: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.82687: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.82690: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.83477: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000270 11124 1726882384.83481: WORKER PROCESS EXITING 11124 1726882384.84724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.86973: done with get_vars() 11124 1726882384.87000: done getting variables 11124 1726882384.87046: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 11124 1726882384.87151: variable 'profile' from source: include params 11124 1726882384.87155: variable 'item' from source: include params 11124 1726882384.87213: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:33:04 -0400 (0:00:00.084) 0:00:25.115 ****** 11124 1726882384.87249: entering _queue_task() for managed_node1/assert 11124 1726882384.87584: worker is 1 (out of 1 available) 11124 1726882384.87598: exiting _queue_task() for managed_node1/assert 11124 1726882384.87611: done queuing things up, now waiting for results queue to drain 11124 1726882384.87612: waiting for pending results... 11124 1726882384.87895: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 11124 1726882384.88006: in run() - task 0e448fcc-3ce9-8362-0f62-000000000271 11124 1726882384.88027: variable 'ansible_search_path' from source: unknown 11124 1726882384.88035: variable 'ansible_search_path' from source: unknown 11124 1726882384.88089: calling self._execute() 11124 1726882384.88216: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.88228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.88243: variable 'omit' from source: magic vars 11124 1726882384.88969: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.88981: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.88988: variable 'omit' from source: magic vars 11124 1726882384.89036: variable 'omit' from source: magic vars 11124 1726882384.89149: variable 'profile' from source: include params 11124 1726882384.89152: variable 'item' from source: include params 11124 
1726882384.89218: variable 'item' from source: include params 11124 1726882384.89246: variable 'omit' from source: magic vars 11124 1726882384.89292: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882384.89327: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882384.89359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882384.89378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.89392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.89422: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882384.89426: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.89428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.89537: Set connection var ansible_shell_executable to /bin/sh 11124 1726882384.89545: Set connection var ansible_shell_type to sh 11124 1726882384.89566: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882384.89571: Set connection var ansible_timeout to 10 11124 1726882384.89577: Set connection var ansible_pipelining to False 11124 1726882384.89580: Set connection var ansible_connection to ssh 11124 1726882384.89604: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.89607: variable 'ansible_connection' from source: unknown 11124 1726882384.89609: variable 'ansible_module_compression' from source: unknown 11124 1726882384.89612: variable 'ansible_shell_type' from source: unknown 11124 1726882384.89614: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.89616: variable 'ansible_host' from source: host 
vars for 'managed_node1' 11124 1726882384.89620: variable 'ansible_pipelining' from source: unknown 11124 1726882384.89624: variable 'ansible_timeout' from source: unknown 11124 1726882384.89626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.89773: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882384.89790: variable 'omit' from source: magic vars 11124 1726882384.89795: starting attempt loop 11124 1726882384.89798: running the handler 11124 1726882384.89918: variable 'lsr_net_profile_fingerprint' from source: set_fact 11124 1726882384.89922: Evaluated conditional (lsr_net_profile_fingerprint): True 11124 1726882384.89929: handler run complete 11124 1726882384.89944: attempt loop complete, returning result 11124 1726882384.89947: _execute() done 11124 1726882384.89950: dumping result to json 11124 1726882384.89954: done dumping result, returning 11124 1726882384.89962: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 [0e448fcc-3ce9-8362-0f62-000000000271] 11124 1726882384.89969: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000271 11124 1726882384.90070: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000271 11124 1726882384.90074: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11124 1726882384.90120: no more pending results, returning what we have 11124 1726882384.90123: results queue empty 11124 1726882384.90124: checking for any_errors_fatal 11124 1726882384.90131: done checking for any_errors_fatal 11124 1726882384.90132: checking for max_fail_percentage 11124 1726882384.90133: done checking for max_fail_percentage 
11124 1726882384.90134: checking to see if all hosts have failed and the running result is not ok 11124 1726882384.90136: done checking to see if all hosts have failed 11124 1726882384.90136: getting the remaining hosts for this loop 11124 1726882384.90138: done getting the remaining hosts for this loop 11124 1726882384.90141: getting the next task for host managed_node1 11124 1726882384.90148: done getting next task for host managed_node1 11124 1726882384.90150: ^ task is: TASK: ** TEST check polling interval 11124 1726882384.90153: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882384.90157: getting variables 11124 1726882384.90159: in VariableManager get_vars() 11124 1726882384.90203: Calling all_inventory to load vars for managed_node1 11124 1726882384.90206: Calling groups_inventory to load vars for managed_node1 11124 1726882384.90209: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882384.90221: Calling all_plugins_play to load vars for managed_node1 11124 1726882384.90224: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882384.90227: Calling groups_plugins_play to load vars for managed_node1 11124 1726882384.91756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882384.93477: done with get_vars() 11124 1726882384.93510: done getting variables 11124 1726882384.93572: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check 
polling interval] ****************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75 Friday 20 September 2024 21:33:04 -0400 (0:00:00.063) 0:00:25.178 ****** 11124 1726882384.93601: entering _queue_task() for managed_node1/command 11124 1726882384.93943: worker is 1 (out of 1 available) 11124 1726882384.93955: exiting _queue_task() for managed_node1/command 11124 1726882384.93971: done queuing things up, now waiting for results queue to drain 11124 1726882384.93973: waiting for pending results... 11124 1726882384.94256: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 11124 1726882384.94366: in run() - task 0e448fcc-3ce9-8362-0f62-000000000071 11124 1726882384.94387: variable 'ansible_search_path' from source: unknown 11124 1726882384.94435: calling self._execute() 11124 1726882384.94546: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.94557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.94573: variable 'omit' from source: magic vars 11124 1726882384.94939: variable 'ansible_distribution_major_version' from source: facts 11124 1726882384.94961: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882384.94975: variable 'omit' from source: magic vars 11124 1726882384.95009: variable 'omit' from source: magic vars 11124 1726882384.95107: variable 'controller_device' from source: play vars 11124 1726882384.95128: variable 'omit' from source: magic vars 11124 1726882384.95177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882384.95216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882384.95242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 
1726882384.95266: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.95287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882384.95322: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882384.95331: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.95338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.95449: Set connection var ansible_shell_executable to /bin/sh 11124 1726882384.95465: Set connection var ansible_shell_type to sh 11124 1726882384.95479: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882384.95489: Set connection var ansible_timeout to 10 11124 1726882384.95503: Set connection var ansible_pipelining to False 11124 1726882384.95509: Set connection var ansible_connection to ssh 11124 1726882384.95534: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.95541: variable 'ansible_connection' from source: unknown 11124 1726882384.95547: variable 'ansible_module_compression' from source: unknown 11124 1726882384.95553: variable 'ansible_shell_type' from source: unknown 11124 1726882384.95559: variable 'ansible_shell_executable' from source: unknown 11124 1726882384.95566: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882384.95573: variable 'ansible_pipelining' from source: unknown 11124 1726882384.95579: variable 'ansible_timeout' from source: unknown 11124 1726882384.95586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882384.95722: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882384.95738: variable 'omit' from source: magic vars 11124 1726882384.95747: starting attempt loop 11124 1726882384.95753: running the handler 11124 1726882384.95774: _low_level_execute_command(): starting 11124 1726882384.95786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882384.96519: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882384.96536: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.96552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.96580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.96624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.96636: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882384.96650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.96675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882384.96690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882384.96701: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882384.96711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.96723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.96735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.96746: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.96756: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882384.96772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.96843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882384.96869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882384.96889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882384.97026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882384.98703: stdout chunk (state=3): >>>/root <<< 11124 1726882384.98793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882384.98893: stderr chunk (state=3): >>><<< 11124 1726882384.98906: stdout chunk (state=3): >>><<< 11124 1726882384.99041: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882384.99044: _low_level_execute_command(): starting 11124 1726882384.99047: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967 `" && echo ansible-tmp-1726882384.989382-12323-127266804750967="` echo /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967 `" ) && sleep 0' 11124 1726882384.99667: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882384.99684: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.99707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.99727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.99772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.99784: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882384.99804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882384.99821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882384.99831: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882384.99841: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882384.99853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882384.99870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882384.99886: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882384.99898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882384.99917: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882384.99931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.00007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.00038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.00055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.00187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.02111: stdout chunk (state=3): >>>ansible-tmp-1726882384.989382-12323-127266804750967=/root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967 <<< 11124 1726882385.02215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.02304: stderr chunk (state=3): >>><<< 11124 1726882385.02315: stdout chunk (state=3): >>><<< 11124 1726882385.02619: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882384.989382-12323-127266804750967=/root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
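The `_low_level_execute_command()` call above creates Ansible's remote working directory with a subshell that sets `umask 77` before the `mkdir`, so the temp tree is private to the connecting user. A minimal sketch of that same pattern, run against a hypothetical local path instead of `/root/.ansible/tmp` on the managed node:

```shell
# Sketch of the tmp-dir creation pattern visible in the log above
# (a hypothetical local path stands in for /root/.ansible/tmp).
base="$(mktemp -d)/ansible-tmp-demo"
( umask 77 && mkdir -p "$base" \
  && mkdir "$base/ansible-tmp-1726882384.989382-12323-127266804750967" \
  && echo ansible-tmp="$base/ansible-tmp-1726882384.989382-12323-127266804750967" )
# umask 77 makes the new directories mode 700, which is why Ansible wraps
# the mkdir in a subshell: the umask change does not leak to later commands.
```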
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.02622: variable 'ansible_module_compression' from source: unknown 11124 1726882385.02624: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882385.02626: variable 'ansible_facts' from source: unknown 11124 1726882385.02628: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/AnsiballZ_command.py 11124 1726882385.02688: Sending initial data 11124 1726882385.02691: Sent initial data (155 bytes) 11124 1726882385.03631: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.03646: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.03662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.03684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.03731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.03743: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.03758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.03778: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 
1726882385.03789: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.03800: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.03816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.03831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.03847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.03860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.03874: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.03886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.03962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.03987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.04001: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.04121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.05852: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882385.05947: stderr chunk (state=3): >>>debug1: Using server download 
size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882385.06045: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp5b1fk95w /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/AnsiballZ_command.py <<< 11124 1726882385.06134: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882385.07291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.07455: stderr chunk (state=3): >>><<< 11124 1726882385.07459: stdout chunk (state=3): >>><<< 11124 1726882385.07462: done transferring module to remote 11124 1726882385.07471: _low_level_execute_command(): starting 11124 1726882385.07474: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/ /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/AnsiballZ_command.py && sleep 0' 11124 1726882385.07863: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.07873: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.07883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.07891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.07921: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.07928: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.07936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.07945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.07953: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.07956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.07967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.07980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882385.07984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.08033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.08060: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.08063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.08157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.09879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.09936: stderr chunk (state=3): >>><<< 11124 1726882385.09959: stdout chunk (state=3): >>><<< 11124 1726882385.09969: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.09972: _low_level_execute_command(): starting 11124 1726882385.09978: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/AnsiballZ_command.py && sleep 0' 11124 1726882385.10610: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.10619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.10629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.10642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.10685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.10692: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.10705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.10717: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.10724: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.10731: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.10738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.10747: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.10760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.10769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.10776: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.10786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.10860: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.10872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.10886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.11043: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.24478: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 21:33:05.239951", "end": "2024-09-20 21:33:05.243380", "delta": "0:00:00.003429", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882385.25636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
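The JSON result above (one attempt, `rc: 0`, stdout `MII Polling Interval (ms): 110`) is consistent with a `command` task registered and retried on an `until` condition. A hypothetical reconstruction of that task; the real one lives in `tests_bond_deprecated.yml` (path shown later in the log) and may differ in details:

```yaml
# Hypothetical sketch, inferred from the cmd, register, and conditional
# seen in this log -- not the verbatim playbook content.
- name: "** TEST check polling interval"
  command: grep 'Polling Interval' /proc/net/bonding/deprecated-bond
  register: result
  until: "'110' in result.stdout"
```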
<<< 11124 1726882385.25696: stderr chunk (state=3): >>><<< 11124 1726882385.25699: stdout chunk (state=3): >>><<< 11124 1726882385.25716: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 21:33:05.239951", "end": "2024-09-20 21:33:05.243380", "delta": "0:00:00.003429", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
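The module prints its entire result as a single JSON object on stdout, which the action plugin parses before evaluating the `'110' in result.stdout` conditional logged below. A minimal Python sketch of that parse-and-check step (an illustration, not Ansible's actual parsing code; the `raw` string is trimmed to the relevant keys from the log):

```python
import json

# Sample module output, truncated from the log above to the relevant keys.
raw = '{"changed": true, "stdout": "MII Polling Interval (ms): 110", "rc": 0}'

result = json.loads(raw)          # Ansible parses the module's stdout as JSON
assert result["rc"] == 0          # the remote command exited successfully
# Equivalent of the conditional evaluated in the log:
print("110" in result["stdout"])  # → True
```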
11124 1726882385.25748: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882385.25757: _low_level_execute_command(): starting 11124 1726882385.25762: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882384.989382-12323-127266804750967/ > /dev/null 2>&1 && sleep 0' 11124 1726882385.26228: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.26235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.26286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.26289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.26292: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.26336: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.26354: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.26458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.28294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.28344: stderr chunk (state=3): >>><<< 11124 1726882385.28347: stdout chunk (state=3): >>><<< 11124 1726882385.28366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
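Every command in this log reaches the host through one multiplexed SSH connection: the repeated `auto-mux: Trying existing master` / `mux_client_request_session` lines show each exec attaching a new session to an already-open master. Ansible enables this via its default `ControlMaster=auto`/`ControlPersist` SSH arguments; an illustrative `ssh_config` fragment with the same effect (option values here are assumptions, not what this run passed):

```
Host 10.31.44.90
    ControlMaster auto
    ControlPath ~/.ssh/cp-%r@%h:%p
    ControlPersist 60s
```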
debug2: Received exit status from master 0 11124 1726882385.28372: handler run complete 11124 1726882385.28392: Evaluated conditional (False): False 11124 1726882385.28508: variable 'result' from source: unknown 11124 1726882385.28520: Evaluated conditional ('110' in result.stdout): True 11124 1726882385.28530: attempt loop complete, returning result 11124 1726882385.28533: _execute() done 11124 1726882385.28535: dumping result to json 11124 1726882385.28540: done dumping result, returning 11124 1726882385.28547: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval [0e448fcc-3ce9-8362-0f62-000000000071] 11124 1726882385.28554: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000071 11124 1726882385.28651: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000071 11124 1726882385.28654: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:00.003429", "end": "2024-09-20 21:33:05.243380", "rc": 0, "start": "2024-09-20 21:33:05.239951" } STDOUT: MII Polling Interval (ms): 110 11124 1726882385.28729: no more pending results, returning what we have 11124 1726882385.28733: results queue empty 11124 1726882385.28734: checking for any_errors_fatal 11124 1726882385.28739: done checking for any_errors_fatal 11124 1726882385.28740: checking for max_fail_percentage 11124 1726882385.28742: done checking for max_fail_percentage 11124 1726882385.28742: checking to see if all hosts have failed and the running result is not ok 11124 1726882385.28744: done checking to see if all hosts have failed 11124 1726882385.28745: getting the remaining hosts for this loop 11124 1726882385.28746: done getting the remaining hosts for this loop 11124 1726882385.28749: getting the next task for host managed_node1 11124 1726882385.28756: done getting next task for host managed_node1 11124 1726882385.28759: ^ task is: TASK: ** TEST 
check IPv4 11124 1726882385.28760: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882385.28766: getting variables 11124 1726882385.28768: in VariableManager get_vars() 11124 1726882385.28806: Calling all_inventory to load vars for managed_node1 11124 1726882385.28809: Calling groups_inventory to load vars for managed_node1 11124 1726882385.28811: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882385.28821: Calling all_plugins_play to load vars for managed_node1 11124 1726882385.28823: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882385.28826: Calling groups_plugins_play to load vars for managed_node1 11124 1726882385.29661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882385.30676: done with get_vars() 11124 1726882385.30693: done getting variables 11124 1726882385.30738: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Friday 20 September 2024 21:33:05 -0400 (0:00:00.371) 0:00:25.550 ****** 11124 1726882385.30759: entering _queue_task() for managed_node1/command 11124 1726882385.30992: worker is 1 (out of 1 available) 11124 1726882385.31005: exiting _queue_task() for managed_node1/command 11124 1726882385.31018: done queuing 
things up, now waiting for results queue to drain 11124 1726882385.31020: waiting for pending results... 11124 1726882385.31201: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 11124 1726882385.31266: in run() - task 0e448fcc-3ce9-8362-0f62-000000000072 11124 1726882385.31278: variable 'ansible_search_path' from source: unknown 11124 1726882385.31308: calling self._execute() 11124 1726882385.31384: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882385.31391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882385.31398: variable 'omit' from source: magic vars 11124 1726882385.31668: variable 'ansible_distribution_major_version' from source: facts 11124 1726882385.31679: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882385.31687: variable 'omit' from source: magic vars 11124 1726882385.31703: variable 'omit' from source: magic vars 11124 1726882385.31768: variable 'controller_device' from source: play vars 11124 1726882385.31781: variable 'omit' from source: magic vars 11124 1726882385.31818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882385.31843: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882385.31862: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882385.31878: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882385.31887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882385.31913: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882385.31916: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882385.31919: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882385.31987: Set connection var ansible_shell_executable to /bin/sh 11124 1726882385.31994: Set connection var ansible_shell_type to sh 11124 1726882385.32001: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882385.32005: Set connection var ansible_timeout to 10 11124 1726882385.32011: Set connection var ansible_pipelining to False 11124 1726882385.32015: Set connection var ansible_connection to ssh 11124 1726882385.32032: variable 'ansible_shell_executable' from source: unknown 11124 1726882385.32035: variable 'ansible_connection' from source: unknown 11124 1726882385.32038: variable 'ansible_module_compression' from source: unknown 11124 1726882385.32040: variable 'ansible_shell_type' from source: unknown 11124 1726882385.32042: variable 'ansible_shell_executable' from source: unknown 11124 1726882385.32044: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882385.32048: variable 'ansible_pipelining' from source: unknown 11124 1726882385.32051: variable 'ansible_timeout' from source: unknown 11124 1726882385.32057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882385.32159: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882385.32169: variable 'omit' from source: magic vars 11124 1726882385.32174: starting attempt loop 11124 1726882385.32176: running the handler 11124 1726882385.32189: _low_level_execute_command(): starting 11124 1726882385.32196: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882385.32733: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.32743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.32773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.32788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.32799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.32846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.32854: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.32870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.32977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.34593: stdout chunk (state=3): >>>/root <<< 11124 1726882385.34710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.34747: stderr chunk (state=3): >>><<< 11124 1726882385.34753: stdout chunk (state=3): >>><<< 11124 1726882385.34772: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.34783: _low_level_execute_command(): starting 11124 1726882385.34789: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843 `" && echo ansible-tmp-1726882385.347717-12335-79977621226843="` echo /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843 `" ) && sleep 0' 11124 1726882385.35474: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.35599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.37422: stdout chunk (state=3): >>>ansible-tmp-1726882385.347717-12335-79977621226843=/root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843 <<< 11124 1726882385.37537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.37617: stderr chunk (state=3): >>><<< 11124 1726882385.37626: stdout chunk (state=3): >>><<< 11124 1726882385.37653: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882385.347717-12335-79977621226843=/root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 
10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.37690: variable 'ansible_module_compression' from source: unknown 11124 1726882385.37754: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882385.37795: variable 'ansible_facts' from source: unknown 11124 1726882385.37882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/AnsiballZ_command.py 11124 1726882385.38041: Sending initial data 11124 1726882385.38044: Sent initial data (154 bytes) 11124 1726882385.38991: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.39004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.39016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.39033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.39080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.39091: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.39103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.39120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.39130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.39139: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 
1726882385.39149: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.39167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.39185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.39196: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.39206: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.39217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.39296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.39316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.39329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.39451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.41248: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882385.41334: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882385.41426: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpjx65q02y /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/AnsiballZ_command.py <<< 11124 1726882385.41515: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882385.42499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.42602: stderr chunk (state=3): >>><<< 11124 1726882385.42605: stdout chunk (state=3): >>><<< 11124 1726882385.42624: done transferring module to remote 11124 1726882385.42632: _low_level_execute_command(): starting 11124 1726882385.42636: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/ /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/AnsiballZ_command.py && sleep 0' 11124 1726882385.43252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.43261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.43302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.43308: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.43321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.43327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.43403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.43409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.43423: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.43545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.45406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.45410: stdout chunk (state=3): >>><<< 11124 1726882385.45418: stderr chunk (state=3): >>><<< 11124 1726882385.45432: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.45435: _low_level_execute_command(): starting 11124 
1726882385.45441: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/AnsiballZ_command.py && sleep 0' 11124 1726882385.46070: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.46079: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.46089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.46102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.46139: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.46145: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.46158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.46174: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.46181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.46188: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.46195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.46204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.46216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.46222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.46229: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.46237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 
1726882385.46314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.46329: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.46340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.46478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.60292: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.101/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 237sec preferred_lft 237sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:33:05.597578", "end": "2024-09-20 21:33:05.601398", "delta": "0:00:00.003820", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882385.61559: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882385.61565: stdout chunk (state=3): >>><<< 11124 1726882385.61568: stderr chunk (state=3): >>><<< 11124 1726882385.61711: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.101/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 237sec preferred_lft 237sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:33:05.597578", "end": "2024-09-20 21:33:05.601398", "delta": "0:00:00.003820", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882385.61715: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882385.61718: _low_level_execute_command(): starting 11124 1726882385.61721: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882385.347717-12335-79977621226843/ > /dev/null 2>&1 && sleep 0' 11124 1726882385.62284: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.62298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.62313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.62335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.62382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.62393: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.62405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.62420: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 11124 1726882385.62431: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.62441: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.62454: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.62470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.62485: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.62494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.62504: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.62515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.62594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.62616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.62632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.62756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.64572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.64667: stderr chunk (state=3): >>><<< 11124 1726882385.64679: stdout chunk (state=3): >>><<< 11124 1726882385.64872: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.64875: handler run complete 11124 1726882385.64878: Evaluated conditional (False): False 11124 1726882385.64907: variable 'result' from source: set_fact 11124 1726882385.64927: Evaluated conditional ('192.0.2' in result.stdout): True 11124 1726882385.64942: attempt loop complete, returning result 11124 1726882385.64949: _execute() done 11124 1726882385.64958: dumping result to json 11124 1726882385.64969: done dumping result, returning 11124 1726882385.64984: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [0e448fcc-3ce9-8362-0f62-000000000072] 11124 1726882385.64995: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000072 ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003820", "end": "2024-09-20 21:33:05.601398", "rc": 0, "start": "2024-09-20 21:33:05.597578" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.101/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 237sec preferred_lft 237sec 11124 1726882385.65195: no more pending results, returning what we have 11124 1726882385.65198: results queue 
empty 11124 1726882385.65199: checking for any_errors_fatal 11124 1726882385.65207: done checking for any_errors_fatal 11124 1726882385.65208: checking for max_fail_percentage 11124 1726882385.65210: done checking for max_fail_percentage 11124 1726882385.65211: checking to see if all hosts have failed and the running result is not ok 11124 1726882385.65212: done checking to see if all hosts have failed 11124 1726882385.65213: getting the remaining hosts for this loop 11124 1726882385.65214: done getting the remaining hosts for this loop 11124 1726882385.65219: getting the next task for host managed_node1 11124 1726882385.65225: done getting next task for host managed_node1 11124 1726882385.65228: ^ task is: TASK: ** TEST check IPv6 11124 1726882385.65230: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882385.65234: getting variables 11124 1726882385.65235: in VariableManager get_vars() 11124 1726882385.65283: Calling all_inventory to load vars for managed_node1 11124 1726882385.65286: Calling groups_inventory to load vars for managed_node1 11124 1726882385.65288: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882385.65300: Calling all_plugins_play to load vars for managed_node1 11124 1726882385.65303: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882385.65306: Calling groups_plugins_play to load vars for managed_node1 11124 1726882385.66089: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000072 11124 1726882385.66093: WORKER PROCESS EXITING 11124 1726882385.67140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882385.68943: done with get_vars() 11124 1726882385.68983: done getting variables 11124 1726882385.69041: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Friday 20 September 2024 21:33:05 -0400 (0:00:00.383) 0:00:25.933 ****** 11124 1726882385.69074: entering _queue_task() for managed_node1/command 11124 1726882385.69411: worker is 1 (out of 1 available) 11124 1726882385.69424: exiting _queue_task() for managed_node1/command 11124 1726882385.69435: done queuing things up, now waiting for results queue to drain 11124 1726882385.69437: waiting for pending results... 
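For readers following this trace: the two `** TEST check` tasks being executed here would, in the playbook, look roughly like the sketch below. This is a hypothetical reconstruction inferred only from the log above (the `ip -4 a s {{ controller_device }}` command, the `result` fact, the `'192.0.2' in result.stdout` conditional, and the attempt loop with `"attempts": 1`); the `retries`/`delay` values and the IPv6 pattern are illustrative assumptions, not the actual contents of `tests_bond_deprecated.yml`.

```yaml
# Hypothetical sketch of the test tasks driving this log section.
# controller_device resolves to "deprecated-bond" in this run.
- name: "** TEST check IPv4"
  command: ip -4 a s {{ controller_device }}
  register: result
  until: "'192.0.2' in result.stdout"   # the conditional evaluated True above
  retries: 20                           # assumed; log shows success on attempt 1
  delay: 3                              # assumed

- name: "** TEST check IPv6"
  command: ip -6 a s {{ controller_device }}
  register: result
```

The `until`/`register` pair explains the "attempt loop complete" and "Evaluated conditional ('192.0.2' in result.stdout): True" messages in the log: the `command` action is retried until the registered stdout contains the expected address prefix.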
11124 1726882385.69722: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 11124 1726882385.69830: in run() - task 0e448fcc-3ce9-8362-0f62-000000000073 11124 1726882385.69860: variable 'ansible_search_path' from source: unknown 11124 1726882385.69909: calling self._execute() 11124 1726882385.70025: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882385.70036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882385.70052: variable 'omit' from source: magic vars 11124 1726882385.70454: variable 'ansible_distribution_major_version' from source: facts 11124 1726882385.70475: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882385.70486: variable 'omit' from source: magic vars 11124 1726882385.70516: variable 'omit' from source: magic vars 11124 1726882385.70625: variable 'controller_device' from source: play vars 11124 1726882385.70654: variable 'omit' from source: magic vars 11124 1726882385.70705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882385.70752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882385.70781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882385.70803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882385.70820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882385.70862: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882385.70874: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882385.70882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882385.70997: Set 
connection var ansible_shell_executable to /bin/sh 11124 1726882385.71011: Set connection var ansible_shell_type to sh 11124 1726882385.71023: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882385.71033: Set connection var ansible_timeout to 10 11124 1726882385.71048: Set connection var ansible_pipelining to False 11124 1726882385.71058: Set connection var ansible_connection to ssh 11124 1726882385.71089: variable 'ansible_shell_executable' from source: unknown 11124 1726882385.71096: variable 'ansible_connection' from source: unknown 11124 1726882385.71104: variable 'ansible_module_compression' from source: unknown 11124 1726882385.71111: variable 'ansible_shell_type' from source: unknown 11124 1726882385.71118: variable 'ansible_shell_executable' from source: unknown 11124 1726882385.71124: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882385.71132: variable 'ansible_pipelining' from source: unknown 11124 1726882385.71139: variable 'ansible_timeout' from source: unknown 11124 1726882385.71148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882385.71307: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882385.71324: variable 'omit' from source: magic vars 11124 1726882385.71334: starting attempt loop 11124 1726882385.71341: running the handler 11124 1726882385.71370: _low_level_execute_command(): starting 11124 1726882385.71384: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882385.72200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.72216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 
1726882385.72232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.72255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.72312: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.72325: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.72340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.72365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.72383: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.72400: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.72413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.72428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.72445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.72462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.72478: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.72498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.72581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.72609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.72629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.72755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 
1726882385.74348: stdout chunk (state=3): >>>/root <<< 11124 1726882385.74520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.74526: stderr chunk (state=3): >>><<< 11124 1726882385.74528: stdout chunk (state=3): >>><<< 11124 1726882385.74557: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.74571: _low_level_execute_command(): starting 11124 1726882385.74578: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687 `" && echo ansible-tmp-1726882385.745555-12349-62247264695687="` echo /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687 `" ) && sleep 0' 11124 1726882385.75227: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 11124 1726882385.75236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.75246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.75260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.75302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.75313: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.75322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.75335: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.75343: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.75352: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.75358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.75369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.75381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.75388: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.75395: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.75403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.75481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.75499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.75509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 
11124 1726882385.75630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.77510: stdout chunk (state=3): >>>ansible-tmp-1726882385.745555-12349-62247264695687=/root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687 <<< 11124 1726882385.77616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.77686: stderr chunk (state=3): >>><<< 11124 1726882385.77689: stdout chunk (state=3): >>><<< 11124 1726882385.77709: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882385.745555-12349-62247264695687=/root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.77742: variable 'ansible_module_compression' from source: unknown 11124 1726882385.77795: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882385.77831: variable 'ansible_facts' from source: unknown 11124 1726882385.77902: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/AnsiballZ_command.py 11124 1726882385.78039: Sending initial data 11124 1726882385.78042: Sent initial data (154 bytes) 11124 1726882385.78955: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.78961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.78973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.78986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.79022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.79029: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.79037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.79053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.79056: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.79066: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.79076: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.79083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.79095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.79101: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 
1726882385.79108: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.79116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.79190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.79207: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.79219: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.79337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.81116: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882385.81207: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882385.81302: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp_x33sn0s /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/AnsiballZ_command.py <<< 11124 1726882385.81396: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882385.82739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.82938: stderr chunk (state=3): >>><<< 11124 1726882385.82942: stdout chunk (state=3): >>><<< 11124 1726882385.82944: done transferring 
module to remote 11124 1726882385.82946: _low_level_execute_command(): starting 11124 1726882385.82956: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/ /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/AnsiballZ_command.py && sleep 0' 11124 1726882385.83562: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.83580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.83601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.83625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.83675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.83689: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.83708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.83731: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.83743: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.83757: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.83778: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.83793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.83814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.83831: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.83844: stderr chunk (state=3): >>>debug2: match found <<< 11124 
1726882385.83862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.83946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.83974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.83991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.84117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882385.85903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882385.85990: stderr chunk (state=3): >>><<< 11124 1726882385.86004: stdout chunk (state=3): >>><<< 11124 1726882385.86117: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882385.86120: 
_low_level_execute_command(): starting 11124 1726882385.86125: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/AnsiballZ_command.py && sleep 0' 11124 1726882385.86838: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882385.86855: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.86873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.86899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.86941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.87001: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882385.87019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.87037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882385.87052: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882385.87067: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882385.87083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882385.87103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882385.87122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882385.87135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882385.87146: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882385.87167: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882385.87252: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882385.87278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882385.87295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882385.87456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882386.01067: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::184/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::a86f:8f6f:13af:7bf1/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::a397:b12b:3132:ceb4/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:33:06.005631", "end": "2024-09-20 21:33:06.009196", "delta": "0:00:00.003565", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882386.02374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882386.02378: stdout chunk (state=3): >>><<< 11124 1726882386.02380: stderr chunk (state=3): >>><<< 11124 1726882386.02526: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::184/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::a86f:8f6f:13af:7bf1/64 scope global dynamic noprefixroute \n valid_lft 1799sec preferred_lft 1799sec\n inet6 fe80::a397:b12b:3132:ceb4/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 21:33:06.005631", "end": "2024-09-20 21:33:06.009196", "delta": "0:00:00.003565", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882386.02538: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882386.02541: _low_level_execute_command(): starting 11124 1726882386.02543: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882385.745555-12349-62247264695687/ > /dev/null 2>&1 && sleep 0' 11124 1726882386.03124: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882386.03138: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.03155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.03175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.03216: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.03231: stderr chunk (state=3): >>>debug2: 
match not found <<< 11124 1726882386.03244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.03269: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882386.03281: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882386.03291: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882386.03302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.03313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.03327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.03338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.03348: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882386.03367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.03443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882386.03466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882386.03481: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882386.03610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882386.05514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882386.05518: stdout chunk (state=3): >>><<< 11124 1726882386.05524: stderr chunk (state=3): >>><<< 11124 1726882386.05549: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882386.05555: handler run complete 11124 1726882386.05586: Evaluated conditional (False): False 11124 1726882386.05744: variable 'result' from source: set_fact 11124 1726882386.05760: Evaluated conditional ('2001' in result.stdout): True 11124 1726882386.05775: attempt loop complete, returning result 11124 1726882386.05785: _execute() done 11124 1726882386.05788: dumping result to json 11124 1726882386.05794: done dumping result, returning 11124 1726882386.05803: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [0e448fcc-3ce9-8362-0f62-000000000073] 11124 1726882386.05807: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000073 ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003565", "end": "2024-09-20 21:33:06.009196", "rc": 0, "start": "2024-09-20 21:33:06.005631" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::184/128 scope 
global dynamic noprefixroute valid_lft 236sec preferred_lft 236sec inet6 2001:db8::a86f:8f6f:13af:7bf1/64 scope global dynamic noprefixroute valid_lft 1799sec preferred_lft 1799sec inet6 fe80::a397:b12b:3132:ceb4/64 scope link noprefixroute valid_lft forever preferred_lft forever 11124 1726882386.06005: no more pending results, returning what we have 11124 1726882386.06009: results queue empty 11124 1726882386.06010: checking for any_errors_fatal 11124 1726882386.06017: done checking for any_errors_fatal 11124 1726882386.06018: checking for max_fail_percentage 11124 1726882386.06020: done checking for max_fail_percentage 11124 1726882386.06021: checking to see if all hosts have failed and the running result is not ok 11124 1726882386.06022: done checking to see if all hosts have failed 11124 1726882386.06023: getting the remaining hosts for this loop 11124 1726882386.06024: done getting the remaining hosts for this loop 11124 1726882386.06028: getting the next task for host managed_node1 11124 1726882386.06040: done getting next task for host managed_node1 11124 1726882386.06045: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11124 1726882386.06049: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11124 1726882386.06071: getting variables 11124 1726882386.06073: in VariableManager get_vars() 11124 1726882386.06116: Calling all_inventory to load vars for managed_node1 11124 1726882386.06119: Calling groups_inventory to load vars for managed_node1 11124 1726882386.06126: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882386.06138: Calling all_plugins_play to load vars for managed_node1 11124 1726882386.06141: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882386.06144: Calling groups_plugins_play to load vars for managed_node1 11124 1726882386.06781: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000073 11124 1726882386.06785: WORKER PROCESS EXITING 11124 1726882386.08086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882386.09828: done with get_vars() 11124 1726882386.09873: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:33:06 -0400 (0:00:00.409) 0:00:26.342 ****** 11124 1726882386.09989: entering _queue_task() for managed_node1/include_tasks 11124 1726882386.10361: worker is 1 (out of 1 available) 11124 1726882386.10375: exiting _queue_task() for managed_node1/include_tasks 11124 1726882386.10391: done queuing things up, now waiting for results queue to drain 11124 1726882386.10392: waiting for pending results... 
11124 1726882386.10676: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11124 1726882386.10842: in run() - task 0e448fcc-3ce9-8362-0f62-00000000007d 11124 1726882386.10861: variable 'ansible_search_path' from source: unknown 11124 1726882386.10870: variable 'ansible_search_path' from source: unknown 11124 1726882386.10910: calling self._execute() 11124 1726882386.11010: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.11020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882386.11031: variable 'omit' from source: magic vars 11124 1726882386.11400: variable 'ansible_distribution_major_version' from source: facts 11124 1726882386.11416: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882386.11428: _execute() done 11124 1726882386.11436: dumping result to json 11124 1726882386.11443: done dumping result, returning 11124 1726882386.11453: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0e448fcc-3ce9-8362-0f62-00000000007d] 11124 1726882386.11468: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000007d 11124 1726882386.11573: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000007d 11124 1726882386.11585: WORKER PROCESS EXITING 11124 1726882386.11631: no more pending results, returning what we have 11124 1726882386.11636: in VariableManager get_vars() 11124 1726882386.11689: Calling all_inventory to load vars for managed_node1 11124 1726882386.11692: Calling groups_inventory to load vars for managed_node1 11124 1726882386.11694: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882386.11708: Calling all_plugins_play to load vars for managed_node1 11124 1726882386.11712: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882386.11715: Calling 
groups_plugins_play to load vars for managed_node1 11124 1726882386.13505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882386.15269: done with get_vars() 11124 1726882386.15297: variable 'ansible_search_path' from source: unknown 11124 1726882386.15299: variable 'ansible_search_path' from source: unknown 11124 1726882386.15348: we have included files to process 11124 1726882386.15349: generating all_blocks data 11124 1726882386.15351: done generating all_blocks data 11124 1726882386.15356: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11124 1726882386.15357: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11124 1726882386.15360: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11124 1726882386.15959: done processing included file 11124 1726882386.15961: iterating over new_blocks loaded from include file 11124 1726882386.15962: in VariableManager get_vars() 11124 1726882386.15996: done with get_vars() 11124 1726882386.15998: filtering new block on tags 11124 1726882386.16029: done filtering new block on tags 11124 1726882386.16032: in VariableManager get_vars() 11124 1726882386.16054: done with get_vars() 11124 1726882386.16056: filtering new block on tags 11124 1726882386.16102: done filtering new block on tags 11124 1726882386.16105: in VariableManager get_vars() 11124 1726882386.16128: done with get_vars() 11124 1726882386.16129: filtering new block on tags 11124 1726882386.16170: done filtering new block on tags 11124 1726882386.16172: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11124 1726882386.16177: extending task lists for 
all hosts with included blocks 11124 1726882386.17326: done extending task lists 11124 1726882386.17327: done processing included files 11124 1726882386.17328: results queue empty 11124 1726882386.17329: checking for any_errors_fatal 11124 1726882386.17333: done checking for any_errors_fatal 11124 1726882386.17334: checking for max_fail_percentage 11124 1726882386.17335: done checking for max_fail_percentage 11124 1726882386.17336: checking to see if all hosts have failed and the running result is not ok 11124 1726882386.17337: done checking to see if all hosts have failed 11124 1726882386.17337: getting the remaining hosts for this loop 11124 1726882386.17338: done getting the remaining hosts for this loop 11124 1726882386.17341: getting the next task for host managed_node1 11124 1726882386.17345: done getting next task for host managed_node1 11124 1726882386.17348: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11124 1726882386.17351: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882386.17361: getting variables 11124 1726882386.17362: in VariableManager get_vars() 11124 1726882386.17380: Calling all_inventory to load vars for managed_node1 11124 1726882386.17382: Calling groups_inventory to load vars for managed_node1 11124 1726882386.17384: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882386.17390: Calling all_plugins_play to load vars for managed_node1 11124 1726882386.17393: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882386.17395: Calling groups_plugins_play to load vars for managed_node1 11124 1726882386.18615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882386.20327: done with get_vars() 11124 1726882386.20365: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:33:06 -0400 (0:00:00.104) 0:00:26.447 ****** 11124 1726882386.20459: entering _queue_task() for managed_node1/setup 11124 1726882386.20820: worker is 1 (out of 1 available) 11124 1726882386.20831: exiting _queue_task() for managed_node1/setup 11124 1726882386.20842: done queuing things up, now waiting for results queue to drain 11124 1726882386.20843: waiting for pending results... 
11124 1726882386.21283: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11124 1726882386.21470: in run() - task 0e448fcc-3ce9-8362-0f62-000000000494 11124 1726882386.21489: variable 'ansible_search_path' from source: unknown 11124 1726882386.21496: variable 'ansible_search_path' from source: unknown 11124 1726882386.21542: calling self._execute() 11124 1726882386.21666: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.21678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882386.21692: variable 'omit' from source: magic vars 11124 1726882386.22078: variable 'ansible_distribution_major_version' from source: facts 11124 1726882386.22095: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882386.22316: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882386.24895: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882386.24977: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882386.25025: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882386.25065: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882386.25099: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882386.25182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882386.25224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882386.25254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882386.25302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882386.25331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882386.25425: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882386.25455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882386.25487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882386.25562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882386.25583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882386.25754: variable '__network_required_facts' from source: role 
'' defaults 11124 1726882386.25775: variable 'ansible_facts' from source: unknown 11124 1726882386.27848: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11124 1726882386.27857: when evaluation is False, skipping this task 11124 1726882386.27868: _execute() done 11124 1726882386.27875: dumping result to json 11124 1726882386.27881: done dumping result, returning 11124 1726882386.27899: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0e448fcc-3ce9-8362-0f62-000000000494] 11124 1726882386.27908: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000494 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882386.28052: no more pending results, returning what we have 11124 1726882386.28056: results queue empty 11124 1726882386.28058: checking for any_errors_fatal 11124 1726882386.28059: done checking for any_errors_fatal 11124 1726882386.28060: checking for max_fail_percentage 11124 1726882386.28061: done checking for max_fail_percentage 11124 1726882386.28062: checking to see if all hosts have failed and the running result is not ok 11124 1726882386.28066: done checking to see if all hosts have failed 11124 1726882386.28067: getting the remaining hosts for this loop 11124 1726882386.28069: done getting the remaining hosts for this loop 11124 1726882386.28073: getting the next task for host managed_node1 11124 1726882386.28084: done getting next task for host managed_node1 11124 1726882386.28088: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11124 1726882386.28094: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882386.28113: getting variables 11124 1726882386.28115: in VariableManager get_vars() 11124 1726882386.28159: Calling all_inventory to load vars for managed_node1 11124 1726882386.28162: Calling groups_inventory to load vars for managed_node1 11124 1726882386.28167: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882386.28179: Calling all_plugins_play to load vars for managed_node1 11124 1726882386.28182: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882386.28185: Calling groups_plugins_play to load vars for managed_node1 11124 1726882386.29201: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000494 11124 1726882386.29205: WORKER PROCESS EXITING 11124 1726882386.30196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882386.32787: done with get_vars() 11124 1726882386.32819: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:33:06 -0400 (0:00:00.124) 0:00:26.571 ****** 11124 1726882386.32919: entering _queue_task() for managed_node1/stat 11124 1726882386.33243: worker is 1 (out of 1 available) 11124 1726882386.33255: exiting _queue_task() for managed_node1/stat 11124 1726882386.33268: done queuing things up, now waiting for results queue to drain 11124 1726882386.33270: waiting for pending results... 11124 1726882386.33947: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11124 1726882386.34110: in run() - task 0e448fcc-3ce9-8362-0f62-000000000496 11124 1726882386.34124: variable 'ansible_search_path' from source: unknown 11124 1726882386.34127: variable 'ansible_search_path' from source: unknown 11124 1726882386.34168: calling self._execute() 11124 1726882386.34262: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.34267: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882386.34279: variable 'omit' from source: magic vars 11124 1726882386.34908: variable 'ansible_distribution_major_version' from source: facts 11124 1726882386.34915: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882386.35091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882386.35377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882386.35427: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882386.35465: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882386.35502: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
11124 1726882386.35619: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882386.35642: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882386.35674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882386.35699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882386.35793: variable '__network_is_ostree' from source: set_fact 11124 1726882386.35799: Evaluated conditional (not __network_is_ostree is defined): False 11124 1726882386.35802: when evaluation is False, skipping this task 11124 1726882386.35805: _execute() done 11124 1726882386.35807: dumping result to json 11124 1726882386.35809: done dumping result, returning 11124 1726882386.35820: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0e448fcc-3ce9-8362-0f62-000000000496] 11124 1726882386.35830: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000496 11124 1726882386.35927: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000496 11124 1726882386.35930: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11124 1726882386.36005: no more pending results, returning what we have 11124 1726882386.36009: results queue empty 11124 1726882386.36010: checking for any_errors_fatal 11124 1726882386.36017: 
done checking for any_errors_fatal 11124 1726882386.36017: checking for max_fail_percentage 11124 1726882386.36019: done checking for max_fail_percentage 11124 1726882386.36020: checking to see if all hosts have failed and the running result is not ok 11124 1726882386.36021: done checking to see if all hosts have failed 11124 1726882386.36022: getting the remaining hosts for this loop 11124 1726882386.36023: done getting the remaining hosts for this loop 11124 1726882386.36027: getting the next task for host managed_node1 11124 1726882386.36034: done getting next task for host managed_node1 11124 1726882386.36037: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11124 1726882386.36042: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882386.36061: getting variables 11124 1726882386.36062: in VariableManager get_vars() 11124 1726882386.36102: Calling all_inventory to load vars for managed_node1 11124 1726882386.36104: Calling groups_inventory to load vars for managed_node1 11124 1726882386.36106: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882386.36116: Calling all_plugins_play to load vars for managed_node1 11124 1726882386.36118: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882386.36121: Calling groups_plugins_play to load vars for managed_node1 11124 1726882386.38647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882386.42307: done with get_vars() 11124 1726882386.42424: done getting variables 11124 1726882386.42518: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:33:06 -0400 (0:00:00.096) 0:00:26.668 ****** 11124 1726882386.42672: entering _queue_task() for managed_node1/set_fact 11124 1726882386.43394: worker is 1 (out of 1 available) 11124 1726882386.43405: exiting _queue_task() for managed_node1/set_fact 11124 1726882386.43417: done queuing things up, now waiting for results queue to drain 11124 1726882386.43419: waiting for pending results... 
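The skip of "Ensure ansible_facts used by role are present" above hinges on the conditional `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. Jinja's `difference` filter is set subtraction, so the task only runs when some required fact is missing. A minimal sketch of that logic follows; the fact names are assumed samples, not the role's actual `__network_required_facts` list.

```python
# Assumed sample of required fact names (the real list comes from role defaults).
required_facts = ["distribution", "distribution_major_version", "os_family"]

# Assumed sample of facts already gathered for managed_node1.
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Equivalent of: __network_required_facts | difference(ansible_facts.keys() | list)
missing = set(required_facts) - set(ansible_facts)

# Equivalent of: ... | length > 0
needs_setup = len(missing) > 0
print(needs_setup)  # False -> conditional is False, setup task is skipped
```

When every required fact is already present, the difference is empty and the task skips, which is exactly the `Evaluated conditional (...): False` record in the log.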
11124 1726882386.44378: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11124 1726882386.44703: in run() - task 0e448fcc-3ce9-8362-0f62-000000000497 11124 1726882386.44714: variable 'ansible_search_path' from source: unknown 11124 1726882386.44717: variable 'ansible_search_path' from source: unknown 11124 1726882386.44872: calling self._execute() 11124 1726882386.45100: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.45112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882386.45127: variable 'omit' from source: magic vars 11124 1726882386.45554: variable 'ansible_distribution_major_version' from source: facts 11124 1726882386.45575: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882386.45757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882386.46049: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882386.46102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882386.46138: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882386.46186: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882386.46280: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882386.46310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882386.46342: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882386.46383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882386.46476: variable '__network_is_ostree' from source: set_fact 11124 1726882386.46492: Evaluated conditional (not __network_is_ostree is defined): False 11124 1726882386.46500: when evaluation is False, skipping this task 11124 1726882386.46506: _execute() done 11124 1726882386.46514: dumping result to json 11124 1726882386.46521: done dumping result, returning 11124 1726882386.46532: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0e448fcc-3ce9-8362-0f62-000000000497] 11124 1726882386.46542: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000497 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11124 1726882386.46695: no more pending results, returning what we have 11124 1726882386.46699: results queue empty 11124 1726882386.46700: checking for any_errors_fatal 11124 1726882386.46708: done checking for any_errors_fatal 11124 1726882386.46709: checking for max_fail_percentage 11124 1726882386.46710: done checking for max_fail_percentage 11124 1726882386.46711: checking to see if all hosts have failed and the running result is not ok 11124 1726882386.46713: done checking to see if all hosts have failed 11124 1726882386.46714: getting the remaining hosts for this loop 11124 1726882386.46715: done getting the remaining hosts for this loop 11124 1726882386.46720: getting the next task for host managed_node1 11124 1726882386.46732: done getting next task for host managed_node1 11124 
1726882386.46736: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11124 1726882386.46743: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882386.46766: getting variables 11124 1726882386.46768: in VariableManager get_vars() 11124 1726882386.46846: Calling all_inventory to load vars for managed_node1 11124 1726882386.46849: Calling groups_inventory to load vars for managed_node1 11124 1726882386.46852: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882386.46866: Calling all_plugins_play to load vars for managed_node1 11124 1726882386.46870: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882386.46873: Calling groups_plugins_play to load vars for managed_node1 11124 1726882386.48108: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000497 11124 1726882386.48112: WORKER PROCESS EXITING 11124 1726882386.54742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882386.55698: done with get_vars() 11124 1726882386.55717: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:33:06 -0400 (0:00:00.131) 0:00:26.800 ****** 11124 1726882386.55781: entering _queue_task() for managed_node1/service_facts 11124 1726882386.56042: worker is 1 (out of 1 available) 11124 1726882386.56055: exiting _queue_task() for managed_node1/service_facts 11124 1726882386.56068: done queuing things up, now waiting for results queue to drain 11124 1726882386.56069: waiting for pending results... 
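Both ostree tasks above ("Check if system is ostree" and "Set flag to indicate system is ostree") are guarded by `when: not __network_is_ostree is defined`. In Jinja, `is defined` tests whether the variable exists at all, so once an earlier pass of `set_facts.yml` has set the fact, both tasks short-circuit with `Conditional result was False`. A sketch, where the fact's value is an assumed example:

```python
# Assumed: __network_is_ostree was set by set_fact earlier in the play.
host_vars = {"__network_is_ostree": False}

# Equivalent of: not __network_is_ostree is defined
should_run = "__network_is_ostree" not in host_vars
print(should_run)  # False -> both ostree tasks are skipped
```

Note that the fact's *value* is irrelevant to the guard; only its existence matters, which is why `false_condition` in the skip result names the `is defined` test rather than the fact itself.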
11124 1726882386.56327: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11124 1726882386.56515: in run() - task 0e448fcc-3ce9-8362-0f62-000000000499 11124 1726882386.56535: variable 'ansible_search_path' from source: unknown 11124 1726882386.56547: variable 'ansible_search_path' from source: unknown 11124 1726882386.56593: calling self._execute() 11124 1726882386.56702: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.56712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882386.56725: variable 'omit' from source: magic vars 11124 1726882386.57109: variable 'ansible_distribution_major_version' from source: facts 11124 1726882386.57126: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882386.57140: variable 'omit' from source: magic vars 11124 1726882386.57229: variable 'omit' from source: magic vars 11124 1726882386.57272: variable 'omit' from source: magic vars 11124 1726882386.57324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882386.57364: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882386.57389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882386.57413: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882386.57434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882386.57470: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882386.57478: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.57489: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11124 1726882386.57596: Set connection var ansible_shell_executable to /bin/sh 11124 1726882386.57613: Set connection var ansible_shell_type to sh 11124 1726882386.57624: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882386.57635: Set connection var ansible_timeout to 10 11124 1726882386.57646: Set connection var ansible_pipelining to False 11124 1726882386.57653: Set connection var ansible_connection to ssh 11124 1726882386.57680: variable 'ansible_shell_executable' from source: unknown 11124 1726882386.57688: variable 'ansible_connection' from source: unknown 11124 1726882386.57694: variable 'ansible_module_compression' from source: unknown 11124 1726882386.57700: variable 'ansible_shell_type' from source: unknown 11124 1726882386.57706: variable 'ansible_shell_executable' from source: unknown 11124 1726882386.57712: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882386.57719: variable 'ansible_pipelining' from source: unknown 11124 1726882386.57724: variable 'ansible_timeout' from source: unknown 11124 1726882386.57730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882386.57931: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882386.57946: variable 'omit' from source: magic vars 11124 1726882386.57955: starting attempt loop 11124 1726882386.57962: running the handler 11124 1726882386.57996: _low_level_execute_command(): starting 11124 1726882386.58009: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882386.58866: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882386.58989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.59094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882386.59110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882386.59240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882386.60891: stdout chunk (state=3): >>>/root <<< 11124 1726882386.61053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882386.61056: stdout chunk (state=3): >>><<< 11124 1726882386.61058: stderr chunk (state=3): >>><<< 11124 1726882386.61170: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882386.61175: _low_level_execute_command(): starting 11124 1726882386.61179: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673 `" && echo ansible-tmp-1726882386.610836-12383-267749096624673="` echo /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673 `" ) && sleep 0' 11124 1726882386.61770: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882386.61785: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.61802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.61820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.61892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.61905: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882386.61919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.62080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
<<< 11124 1726882386.62094: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882386.62106: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882386.62120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.62134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.62150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.62170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.62190: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882386.62205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.62288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882386.62312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882386.62329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882386.62454: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882386.64343: stdout chunk (state=3): >>>ansible-tmp-1726882386.610836-12383-267749096624673=/root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673 <<< 11124 1726882386.64480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882386.64555: stderr chunk (state=3): >>><<< 11124 1726882386.64559: stdout chunk (state=3): >>><<< 11124 1726882386.64674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882386.610836-12383-267749096624673=/root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882386.64679: variable 'ansible_module_compression' from source: unknown 11124 1726882386.64783: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11124 1726882386.64787: variable 'ansible_facts' from source: unknown 11124 1726882386.64824: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/AnsiballZ_service_facts.py 11124 1726882386.64984: Sending initial data 11124 1726882386.64988: Sent initial data (161 bytes) 11124 1726882386.66021: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882386.66038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.66054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.66078: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.66126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.66137: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882386.66150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.66169: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882386.66180: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882386.66190: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882386.66208: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.66223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.66239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.66252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.66268: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882386.66282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.66366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882386.66390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882386.66407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882386.66541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882386.68301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882386.68388: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882386.68482: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp07qe0mp1 /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/AnsiballZ_service_facts.py <<< 11124 1726882386.68573: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882386.69854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882386.70134: stderr chunk (state=3): >>><<< 11124 1726882386.70137: stdout chunk (state=3): >>><<< 11124 1726882386.70140: done transferring module to remote 11124 1726882386.70142: _low_level_execute_command(): starting 11124 1726882386.70144: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/ /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/AnsiballZ_service_facts.py && sleep 0' 11124 1726882386.70761: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882386.70783: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.70803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.70823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 
1726882386.70868: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.70888: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882386.70909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.70927: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882386.70938: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882386.70950: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882386.70962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.70979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.70998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.71014: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.71028: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882386.71041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.71122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882386.71149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882386.71167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882386.71293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882386.73085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882386.73158: stderr chunk (state=3): >>><<< 11124 1726882386.73161: stdout chunk (state=3): >>><<< 11124 1726882386.73188: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882386.73191: _low_level_execute_command(): starting 11124 1726882386.73197: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/AnsiballZ_service_facts.py && sleep 0' 11124 1726882386.73899: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882386.73907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.73918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.73931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.73981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.73988: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882386.73998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.74011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882386.74019: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882386.74025: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882386.74032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882386.74041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882386.74066: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882386.74075: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882386.74081: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882386.74090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882386.74161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882386.74189: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882386.74201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882386.74328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.05947: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": 
{"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.serv<<< 11124 1726882388.05999: stdout chunk (state=3): >>>ice", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": 
"stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "s<<< 11124 1726882388.06009: stdout chunk (state=3): >>>tatic", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alia<<< 11124 1726882388.06012: stdout chunk (state=3): >>>s", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": 
"rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service"<<< 11124 1726882388.06019: stdout chunk (state=3): >>>, "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"},
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": 
"systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11124 1726882388.07276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882388.07335: stderr chunk (state=3): >>><<< 11124 1726882388.07338: stdout chunk (state=3): >>><<< 11124 1726882388.07368: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rhsmcertd.service": {"name": "rhsmcertd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": 
"serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": 
"systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": 
"rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhcd.service": {"name": "rhcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm-facts.service": {"name": "rhsm-facts.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rhsm.service": {"name": "rhsm.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", 
"state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
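[editor's note] The `service_facts` stdout above is a plain JSON payload whose `ansible_facts.services` mapping can be post-processed outside of Ansible. A minimal sketch, using a few entries copied from the output above (the variable names are illustrative, not part of the module's API):

```python
import json

# A small sample of the service_facts payload logged above.
payload = json.loads('''
{"ansible_facts": {"services": {
    "sshd.service":      {"name": "sshd.service",      "state": "running",  "status": "enabled",  "source": "systemd"},
    "chronyd.service":   {"name": "chronyd.service",   "state": "running",  "status": "enabled",  "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"},
    "modprobe@.service": {"name": "modprobe@.service", "state": "unknown",  "status": "static",   "source": "systemd"}
}}}
''')

services = payload["ansible_facts"]["services"]

# Units the managed node reports as currently running.
running = sorted(name for name, svc in services.items() if svc["state"] == "running")

# Template units (names containing "@.") appear with state "unknown" in this
# log, since a template has no concrete instance to inspect.
templates = sorted(name for name in services if "@." in name)

print(running)    # ['chronyd.service', 'sshd.service']
print(templates)  # ['modprobe@.service']
```

Inside a playbook the same data is reachable as `ansible_facts.services` after the `service_facts` task runs.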
11124 1726882388.07760: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882388.07770: _low_level_execute_command(): starting 11124 1726882388.07775: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882386.610836-12383-267749096624673/ > /dev/null 2>&1 && sleep 0' 11124 1726882388.08238: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.08241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.08283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.08287: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.08290: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.08342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.08345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882388.08348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.08444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.10245: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882388.10298: stderr chunk (state=3): >>><<< 11124 1726882388.10301: stdout chunk (state=3): >>><<< 11124 1726882388.10314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
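The `rm -f -r ... && sleep 0` command just executed is the tail end of the remote tmpdir lifecycle: Ansible creates a per-task directory under `~/.ansible/tmp` with `umask 77` (mode 0700), stages and runs the module there, then deletes it. A local sketch of the same lifecycle (illustrative path, not the one from this run):

```python
# Sketch of the remote tmpdir lifecycle visible in this log: create a
# per-task directory with mode 0700, then remove it during cleanup.
# A local scratch path is used here purely for illustration.
import os
import shutil
import stat
import tempfile
import time

base = os.path.join(tempfile.gettempdir(), "ansible-demo")
tmp = os.path.join(base, f"ansible-tmp-{time.time():.6f}-0-12345")

old_umask = os.umask(0o077)          # same effect as the "umask 77" in the log
try:
    os.makedirs(base, exist_ok=True)
    os.mkdir(tmp)                    # created with mode 0700 under this umask
finally:
    os.umask(old_umask)

mode = stat.S_IMODE(os.stat(tmp).st_mode)
print(oct(mode))                     # 0o700

shutil.rmtree(tmp, ignore_errors=True)   # the "rm -f -r ..." cleanup step
print(os.path.isdir(tmp))                # False
```

The `&& sleep 0` suffix seen on each remote command is a quirk of how the ssh connection plugin flushes output; it carries no semantics of its own.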
11124 1726882388.10320: handler run complete 11124 1726882388.10425: variable 'ansible_facts' from source: unknown 11124 1726882388.10523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882388.10772: variable 'ansible_facts' from source: unknown 11124 1726882388.10842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882388.10949: attempt loop complete, returning result 11124 1726882388.10956: _execute() done 11124 1726882388.10959: dumping result to json 11124 1726882388.10996: done dumping result, returning 11124 1726882388.11004: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0e448fcc-3ce9-8362-0f62-000000000499] 11124 1726882388.11009: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000499 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882388.11692: no more pending results, returning what we have 11124 1726882388.11694: results queue empty 11124 1726882388.11695: checking for any_errors_fatal 11124 1726882388.11699: done checking for any_errors_fatal 11124 1726882388.11699: checking for max_fail_percentage 11124 1726882388.11701: done checking for max_fail_percentage 11124 1726882388.11702: checking to see if all hosts have failed and the running result is not ok 11124 1726882388.11703: done checking to see if all hosts have failed 11124 1726882388.11703: getting the remaining hosts for this loop 11124 1726882388.11704: done getting the remaining hosts for this loop 11124 1726882388.11707: getting the next task for host managed_node1 11124 1726882388.11712: done getting next task for host managed_node1 11124 1726882388.11715: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11124 
1726882388.11719: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882388.11728: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000499 11124 1726882388.11730: WORKER PROCESS EXITING 11124 1726882388.11736: getting variables 11124 1726882388.11737: in VariableManager get_vars() 11124 1726882388.11762: Calling all_inventory to load vars for managed_node1 11124 1726882388.11766: Calling groups_inventory to load vars for managed_node1 11124 1726882388.11768: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882388.11775: Calling all_plugins_play to load vars for managed_node1 11124 1726882388.11776: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882388.11778: Calling groups_plugins_play to load vars for managed_node1 11124 1726882388.12588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882388.13532: done with get_vars() 11124 1726882388.13549: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:33:08 -0400 (0:00:01.578) 0:00:28.378 ****** 11124 1726882388.13622: entering _queue_task() for managed_node1/package_facts 11124 1726882388.13846: worker is 1 (out of 1 available) 11124 1726882388.13858: exiting _queue_task() for managed_node1/package_facts 11124 1726882388.13873: done queuing things up, now waiting for results queue to drain 11124 1726882388.13874: waiting for pending results... 
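The task being queued here runs `package_facts`; its result (visible in the stdout chunks further down) maps each package name to a *list* of dicts with `name`, `version`, `release`, `epoch`, `arch`, and `source` keys. A short sketch of reading that structure, with sample entries abridged from the module output later in this log:

```python
# Sketch: look up package versions in a package_facts-style payload.
# Sample entries abridged from the module stdout later in this log.
packages = {
    "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9",
              "epoch": None, "arch": "x86_64", "source": "rpm"}],
    "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9",
               "epoch": None, "arch": "x86_64", "source": "rpm"}],
}

def nvr(name):
    """Render name-version-release strings. Each key maps to a list
    because multiple arches/versions of one package can coexist."""
    return [f"{p['name']}-{p['version']}-{p['release']}"
            for p in packages[name]]

print(nvr("bash"))   # ['bash-5.1.8-9.el9']
```

In a playbook this data would be consumed as `ansible_facts.packages`, which is how the role decides which network provider packages are already present.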
11124 1726882388.14057: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11124 1726882388.14170: in run() - task 0e448fcc-3ce9-8362-0f62-00000000049a 11124 1726882388.14182: variable 'ansible_search_path' from source: unknown 11124 1726882388.14185: variable 'ansible_search_path' from source: unknown 11124 1726882388.14216: calling self._execute() 11124 1726882388.14290: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882388.14294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882388.14302: variable 'omit' from source: magic vars 11124 1726882388.14576: variable 'ansible_distribution_major_version' from source: facts 11124 1726882388.14586: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882388.14592: variable 'omit' from source: magic vars 11124 1726882388.14648: variable 'omit' from source: magic vars 11124 1726882388.14677: variable 'omit' from source: magic vars 11124 1726882388.14709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882388.14734: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882388.14756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882388.14768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882388.14777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882388.14800: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882388.14803: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882388.14806: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11124 1726882388.14877: Set connection var ansible_shell_executable to /bin/sh 11124 1726882388.14883: Set connection var ansible_shell_type to sh 11124 1726882388.14890: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882388.14895: Set connection var ansible_timeout to 10 11124 1726882388.14900: Set connection var ansible_pipelining to False 11124 1726882388.14902: Set connection var ansible_connection to ssh 11124 1726882388.14920: variable 'ansible_shell_executable' from source: unknown 11124 1726882388.14922: variable 'ansible_connection' from source: unknown 11124 1726882388.14925: variable 'ansible_module_compression' from source: unknown 11124 1726882388.14928: variable 'ansible_shell_type' from source: unknown 11124 1726882388.14930: variable 'ansible_shell_executable' from source: unknown 11124 1726882388.14934: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882388.14938: variable 'ansible_pipelining' from source: unknown 11124 1726882388.14941: variable 'ansible_timeout' from source: unknown 11124 1726882388.14945: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882388.15086: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882388.15092: variable 'omit' from source: magic vars 11124 1726882388.15101: starting attempt loop 11124 1726882388.15104: running the handler 11124 1726882388.15116: _low_level_execute_command(): starting 11124 1726882388.15122: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882388.15645: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.15672: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882388.15688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.15736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.15760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.15860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.17503: stdout chunk (state=3): >>>/root <<< 11124 1726882388.17607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882388.17661: stderr chunk (state=3): >>><<< 11124 1726882388.17670: stdout chunk (state=3): >>><<< 11124 1726882388.17693: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882388.17703: _low_level_execute_command(): starting 11124 1726882388.17710: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218 `" && echo ansible-tmp-1726882388.1769168-12431-129227357210218="` echo /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218 `" ) && sleep 0' 11124 1726882388.18170: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.18176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.18209: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.18223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 
1726882388.18244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.18288: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.18300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.18406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.20290: stdout chunk (state=3): >>>ansible-tmp-1726882388.1769168-12431-129227357210218=/root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218 <<< 11124 1726882388.20398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882388.20446: stderr chunk (state=3): >>><<< 11124 1726882388.20450: stdout chunk (state=3): >>><<< 11124 1726882388.20468: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882388.1769168-12431-129227357210218=/root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882388.20513: variable 'ansible_module_compression' from source: unknown 11124 1726882388.20550: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11124 1726882388.20605: variable 'ansible_facts' from source: unknown 11124 1726882388.20739: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/AnsiballZ_package_facts.py 11124 1726882388.20862: Sending initial data 11124 1726882388.20868: Sent initial data (162 bytes) 11124 1726882388.21540: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.21544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.21581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.21605: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.21608: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882388.21611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.21669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.21672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882388.21674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.21774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.23534: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882388.23614: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882388.23715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmppj7y7guq /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/AnsiballZ_package_facts.py <<< 11124 1726882388.23798: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882388.26807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882388.26896: stderr chunk (state=3): >>><<< 11124 1726882388.26900: 
stdout chunk (state=3): >>><<< 11124 1726882388.26922: done transferring module to remote 11124 1726882388.26933: _low_level_execute_command(): starting 11124 1726882388.26938: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/ /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/AnsiballZ_package_facts.py && sleep 0' 11124 1726882388.27578: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882388.27586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.27597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.27648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882388.27655: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882388.27665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.27678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882388.27687: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882388.27693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882388.27702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.27710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.27721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.27729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 
11124 1726882388.27736: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882388.27745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.27818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.27836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882388.27853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.27972: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.29859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882388.29864: stdout chunk (state=3): >>><<< 11124 1726882388.29867: stderr chunk (state=3): >>><<< 11124 1726882388.29961: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11124 1726882388.29967: _low_level_execute_command(): starting 11124 1726882388.29970: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/AnsiballZ_package_facts.py && sleep 0' 11124 1726882388.30548: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882388.30568: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.30587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.30606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.30647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882388.30671: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882388.30689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.30707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882388.30720: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882388.30732: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882388.30744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.30761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.30779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.30794: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882388.30808: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882388.30822: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.30901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.30926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882388.30942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.31082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.77030: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": 
"10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "e<<< 11124 1726882388.77049: stdout chunk (state=3): >>>poch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": 
[{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": 
[{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects"<<< 11124 1726882388.77058: stdout chunk (state=3): >>>: [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source"<<< 11124 1726882388.77062: stdout chunk (state=3): >>>: "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release<<< 11124 1726882388.77077: stdout chunk (state=3): >>>": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", 
"release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]<<< 11124 1726882388.77082: stdout chunk (state=3): >>>, "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": 
"grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": 
"libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.1<<< 11124 1726882388.77087: stdout chunk (state=3): >>>6.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": 
[{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202<<< 11124 1726882388.77089: stdout chunk (state=3): >>>", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": 
"python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", 
"epoch": null, "a<<< 11124 1726882388.77095: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "sour<<< 11124 1726882388.77156: stdout chunk (state=3): >>>ce": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", 
"version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": 
"nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": 
[{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", 
"release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": 
"perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", 
"source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", 
"version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": 
"beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch<<< 11124 1726882388.77170: stdout chunk (state=3): >>>", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11124 1726882388.78636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882388.78693: stderr chunk (state=3): >>><<< 11124 1726882388.78696: stdout chunk (state=3): >>><<< 11124 1726882388.78735: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "subscription-manager-rhsm-certificates": [{"name": "subscription-manager-rhsm-certificates", "version": "20220623", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": 
[{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": 
"0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dmidecode": [{"name": "dmidecode", "version": "3.6", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": 
"cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", 
"version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-iniparse": [{"name": "python3-iniparse", "version": "0.4", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-inotify": [{"name": "python3-inotify", "version": "0.9.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-decorator": [{"name": "python3-decorator", "version": "4.4.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-cloud-what": [{"name": "python3-cloud-what", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "45.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", 
"release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "virt-what": [{"name": "virt-what", "version": "1.25", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", 
"version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": 
[{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.44", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", 
"release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "45.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": 
"elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "usermode": [{"name": "usermode", "version": "1.114", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf-plugin-subscription-manager": [{"name": "libdnf-plugin-subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-librepo": [{"name": "python3-librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-subscription-manager-rhsm": [{"name": "python3-subscription-manager-rhsm", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "subscription-manager": [{"name": "subscription-manager", "version": "1.29.42", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", 
"release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "policycoreutils-python-utils": [{"name": "policycoreutils-python-utils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": 
"liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "rhc": [{"name": "rhc", "version": "0.2.4", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.gitbaf3e06.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": 
"146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": 
[{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "122.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", 
"version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, 
"arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "2.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": 
"python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": 
[{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "5.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882388.80828: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882388.80858: _low_level_execute_command(): starting 11124 1726882388.80871: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882388.1769168-12431-129227357210218/ > /dev/null 2>&1 && sleep 0' 11124 1726882388.81528: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882388.81542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.81562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.81584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.81627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882388.81639: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882388.81655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.81675: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882388.81686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address 
<<< 11124 1726882388.81697: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882388.81709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882388.81723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882388.81738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882388.81749: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882388.81765: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882388.81778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882388.81857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882388.81883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882388.81898: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882388.82021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882388.83837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882388.83920: stderr chunk (state=3): >>><<< 11124 1726882388.83930: stdout chunk (state=3): >>><<< 11124 1726882388.84269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882388.84272: handler run complete 11124 1726882388.84909: variable 'ansible_facts' from source: unknown 11124 1726882388.85455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882388.87765: variable 'ansible_facts' from source: unknown 11124 1726882388.88287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882388.89156: attempt loop complete, returning result 11124 1726882388.89181: _execute() done 11124 1726882388.89188: dumping result to json 11124 1726882388.89437: done dumping result, returning 11124 1726882388.89450: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0e448fcc-3ce9-8362-0f62-00000000049a] 11124 1726882388.89460: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000049a 11124 1726882388.91684: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000049a 11124 1726882388.91688: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882388.91892: no more pending results, returning what we have 11124 1726882388.91895: results queue empty 11124 1726882388.91896: checking for 
any_errors_fatal 11124 1726882388.91903: done checking for any_errors_fatal 11124 1726882388.91904: checking for max_fail_percentage 11124 1726882388.91906: done checking for max_fail_percentage 11124 1726882388.91907: checking to see if all hosts have failed and the running result is not ok 11124 1726882388.91908: done checking to see if all hosts have failed 11124 1726882388.91909: getting the remaining hosts for this loop 11124 1726882388.91910: done getting the remaining hosts for this loop 11124 1726882388.91913: getting the next task for host managed_node1 11124 1726882388.91921: done getting next task for host managed_node1 11124 1726882388.91925: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11124 1726882388.91930: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882388.91942: getting variables 11124 1726882388.91944: in VariableManager get_vars() 11124 1726882388.91986: Calling all_inventory to load vars for managed_node1 11124 1726882388.91989: Calling groups_inventory to load vars for managed_node1 11124 1726882388.91992: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882388.92004: Calling all_plugins_play to load vars for managed_node1 11124 1726882388.92006: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882388.92015: Calling groups_plugins_play to load vars for managed_node1 11124 1726882388.93608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882388.95993: done with get_vars() 11124 1726882388.96027: done getting variables 11124 1726882388.96127: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:33:08 -0400 (0:00:00.825) 0:00:29.204 ****** 11124 1726882388.96168: entering _queue_task() for managed_node1/debug 11124 1726882388.96995: worker is 1 (out of 1 available) 11124 1726882388.97009: exiting _queue_task() for managed_node1/debug 11124 1726882388.97021: done queuing things up, now waiting for results queue to drain 11124 1726882388.97023: waiting for pending results... 
11124 1726882388.97477: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11124 1726882388.97783: in run() - task 0e448fcc-3ce9-8362-0f62-00000000007e 11124 1726882388.97865: variable 'ansible_search_path' from source: unknown 11124 1726882388.97959: variable 'ansible_search_path' from source: unknown 11124 1726882388.98007: calling self._execute() 11124 1726882388.98239: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882388.98254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882388.98343: variable 'omit' from source: magic vars 11124 1726882388.99133: variable 'ansible_distribution_major_version' from source: facts 11124 1726882388.99177: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882388.99190: variable 'omit' from source: magic vars 11124 1726882388.99263: variable 'omit' from source: magic vars 11124 1726882388.99331: variable 'network_provider' from source: set_fact 11124 1726882388.99344: variable 'omit' from source: magic vars 11124 1726882388.99382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882388.99410: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882388.99426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882388.99438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882388.99447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882388.99477: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882388.99480: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 
1726882388.99483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882388.99552: Set connection var ansible_shell_executable to /bin/sh 11124 1726882388.99561: Set connection var ansible_shell_type to sh 11124 1726882388.99569: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882388.99574: Set connection var ansible_timeout to 10 11124 1726882388.99579: Set connection var ansible_pipelining to False 11124 1726882388.99581: Set connection var ansible_connection to ssh 11124 1726882388.99600: variable 'ansible_shell_executable' from source: unknown 11124 1726882388.99603: variable 'ansible_connection' from source: unknown 11124 1726882388.99606: variable 'ansible_module_compression' from source: unknown 11124 1726882388.99609: variable 'ansible_shell_type' from source: unknown 11124 1726882388.99611: variable 'ansible_shell_executable' from source: unknown 11124 1726882388.99614: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882388.99616: variable 'ansible_pipelining' from source: unknown 11124 1726882388.99618: variable 'ansible_timeout' from source: unknown 11124 1726882388.99621: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882388.99723: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882388.99733: variable 'omit' from source: magic vars 11124 1726882388.99738: starting attempt loop 11124 1726882388.99740: running the handler 11124 1726882388.99780: handler run complete 11124 1726882388.99791: attempt loop complete, returning result 11124 1726882388.99794: _execute() done 11124 1726882388.99796: dumping result to json 11124 1726882388.99799: done dumping result, returning 
11124 1726882388.99805: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0e448fcc-3ce9-8362-0f62-00000000007e] 11124 1726882388.99814: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000007e 11124 1726882388.99901: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000007e 11124 1726882388.99904: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 11124 1726882388.99981: no more pending results, returning what we have 11124 1726882388.99984: results queue empty 11124 1726882388.99985: checking for any_errors_fatal 11124 1726882388.99995: done checking for any_errors_fatal 11124 1726882388.99996: checking for max_fail_percentage 11124 1726882388.99997: done checking for max_fail_percentage 11124 1726882388.99998: checking to see if all hosts have failed and the running result is not ok 11124 1726882388.99999: done checking to see if all hosts have failed 11124 1726882389.00000: getting the remaining hosts for this loop 11124 1726882389.00001: done getting the remaining hosts for this loop 11124 1726882389.00005: getting the next task for host managed_node1 11124 1726882389.00011: done getting next task for host managed_node1 11124 1726882389.00017: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11124 1726882389.00020: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882389.00030: getting variables 11124 1726882389.00031: in VariableManager get_vars() 11124 1726882389.00071: Calling all_inventory to load vars for managed_node1 11124 1726882389.00074: Calling groups_inventory to load vars for managed_node1 11124 1726882389.00076: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.00085: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.00088: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.00091: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.01183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.02579: done with get_vars() 11124 1726882389.02597: done getting variables 11124 1726882389.02641: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:33:09 -0400 (0:00:00.065) 0:00:29.269 ****** 11124 1726882389.02670: entering _queue_task() for managed_node1/fail 11124 
1726882389.02904: worker is 1 (out of 1 available) 11124 1726882389.02917: exiting _queue_task() for managed_node1/fail 11124 1726882389.02930: done queuing things up, now waiting for results queue to drain 11124 1726882389.02932: waiting for pending results... 11124 1726882389.03115: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11124 1726882389.03220: in run() - task 0e448fcc-3ce9-8362-0f62-00000000007f 11124 1726882389.03232: variable 'ansible_search_path' from source: unknown 11124 1726882389.03237: variable 'ansible_search_path' from source: unknown 11124 1726882389.03272: calling self._execute() 11124 1726882389.03345: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.03350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.03360: variable 'omit' from source: magic vars 11124 1726882389.03633: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.03643: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.03732: variable 'network_state' from source: role '' defaults 11124 1726882389.03740: Evaluated conditional (network_state != {}): False 11124 1726882389.03743: when evaluation is False, skipping this task 11124 1726882389.03746: _execute() done 11124 1726882389.03748: dumping result to json 11124 1726882389.03750: done dumping result, returning 11124 1726882389.03760: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0e448fcc-3ce9-8362-0f62-00000000007f] 11124 1726882389.03767: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000007f 11124 1726882389.03853: done sending task result for task 
0e448fcc-3ce9-8362-0f62-00000000007f 11124 1726882389.03856: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882389.03904: no more pending results, returning what we have 11124 1726882389.03907: results queue empty 11124 1726882389.03908: checking for any_errors_fatal 11124 1726882389.03914: done checking for any_errors_fatal 11124 1726882389.03915: checking for max_fail_percentage 11124 1726882389.03916: done checking for max_fail_percentage 11124 1726882389.03917: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.03918: done checking to see if all hosts have failed 11124 1726882389.03919: getting the remaining hosts for this loop 11124 1726882389.03920: done getting the remaining hosts for this loop 11124 1726882389.03923: getting the next task for host managed_node1 11124 1726882389.03930: done getting next task for host managed_node1 11124 1726882389.03934: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11124 1726882389.03938: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11124 1726882389.03958: getting variables 11124 1726882389.03960: in VariableManager get_vars() 11124 1726882389.03997: Calling all_inventory to load vars for managed_node1 11124 1726882389.04000: Calling groups_inventory to load vars for managed_node1 11124 1726882389.04002: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.04012: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.04014: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.04017: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.05370: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.06678: done with get_vars() 11124 1726882389.06696: done getting variables 11124 1726882389.06739: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:33:09 -0400 (0:00:00.040) 0:00:29.310 ****** 11124 1726882389.06767: entering _queue_task() for managed_node1/fail 11124 1726882389.06998: worker is 1 (out of 1 available) 11124 1726882389.07011: exiting _queue_task() for managed_node1/fail 11124 1726882389.07024: done queuing things up, now waiting for results queue to drain 11124 1726882389.07026: waiting for pending results... 
11124 1726882389.07217: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11124 1726882389.07321: in run() - task 0e448fcc-3ce9-8362-0f62-000000000080 11124 1726882389.07336: variable 'ansible_search_path' from source: unknown 11124 1726882389.07340: variable 'ansible_search_path' from source: unknown 11124 1726882389.07377: calling self._execute() 11124 1726882389.07460: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.07465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.07475: variable 'omit' from source: magic vars 11124 1726882389.07749: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.07762: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.07845: variable 'network_state' from source: role '' defaults 11124 1726882389.07856: Evaluated conditional (network_state != {}): False 11124 1726882389.07860: when evaluation is False, skipping this task 11124 1726882389.07864: _execute() done 11124 1726882389.07867: dumping result to json 11124 1726882389.07870: done dumping result, returning 11124 1726882389.07880: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0e448fcc-3ce9-8362-0f62-000000000080] 11124 1726882389.07883: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000080 11124 1726882389.08010: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000080 11124 1726882389.08013: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882389.08058: no more pending results, returning what we have 11124 
1726882389.08065: results queue empty 11124 1726882389.08067: checking for any_errors_fatal 11124 1726882389.08098: done checking for any_errors_fatal 11124 1726882389.08099: checking for max_fail_percentage 11124 1726882389.08101: done checking for max_fail_percentage 11124 1726882389.08101: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.08108: done checking to see if all hosts have failed 11124 1726882389.08109: getting the remaining hosts for this loop 11124 1726882389.08111: done getting the remaining hosts for this loop 11124 1726882389.08114: getting the next task for host managed_node1 11124 1726882389.08119: done getting next task for host managed_node1 11124 1726882389.08123: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11124 1726882389.08126: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882389.08142: getting variables 11124 1726882389.08143: in VariableManager get_vars() 11124 1726882389.08210: Calling all_inventory to load vars for managed_node1 11124 1726882389.08219: Calling groups_inventory to load vars for managed_node1 11124 1726882389.08221: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.08228: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.08230: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.08231: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.09558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.11122: done with get_vars() 11124 1726882389.11144: done getting variables 11124 1726882389.11193: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:33:09 -0400 (0:00:00.044) 0:00:29.354 ****** 11124 1726882389.11220: entering _queue_task() for managed_node1/fail 11124 1726882389.11458: worker is 1 (out of 1 available) 11124 1726882389.11474: exiting _queue_task() for managed_node1/fail 11124 1726882389.11486: done queuing things up, now waiting for results queue to drain 11124 1726882389.11487: waiting for pending results... 
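The next queued task (`roles/network/tasks/main.yml:25`) guards against teaming on EL10+. A hypothetical sketch of the shape of that task, with the single condition copied from the log and the message text assumed:

```yaml
# Hypothetical reconstruction; the when-condition is taken verbatim from the log.
- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  fail:
    msg: Teaming is not supported on EL10 or later  # assumed wording
  when: ansible_distribution_major_version | int > 9  # evaluated False here, so the task is skipped
```

On this managed host the major version is not above 9, so the conditional evaluates False and the task is skipped, as the trace below records.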
11124 1726882389.11666: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11124 1726882389.11764: in run() - task 0e448fcc-3ce9-8362-0f62-000000000081 11124 1726882389.11776: variable 'ansible_search_path' from source: unknown 11124 1726882389.11779: variable 'ansible_search_path' from source: unknown 11124 1726882389.11808: calling self._execute() 11124 1726882389.11888: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.11891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.11900: variable 'omit' from source: magic vars 11124 1726882389.12172: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.12182: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.12305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882389.14395: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882389.14439: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882389.14468: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882389.14497: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882389.14517: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882389.14578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.14611: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.14630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.14658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.14672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.14743: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.14757: Evaluated conditional (ansible_distribution_major_version | int > 9): False 11124 1726882389.14760: when evaluation is False, skipping this task 11124 1726882389.14765: _execute() done 11124 1726882389.14768: dumping result to json 11124 1726882389.14770: done dumping result, returning 11124 1726882389.14778: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0e448fcc-3ce9-8362-0f62-000000000081] 11124 1726882389.14782: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000081 11124 1726882389.14874: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000081 11124 1726882389.14877: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 11124 1726882389.14921: no more pending results, returning what we have 11124 1726882389.14924: 
results queue empty 11124 1726882389.14926: checking for any_errors_fatal 11124 1726882389.14933: done checking for any_errors_fatal 11124 1726882389.14934: checking for max_fail_percentage 11124 1726882389.14936: done checking for max_fail_percentage 11124 1726882389.14936: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.14937: done checking to see if all hosts have failed 11124 1726882389.14938: getting the remaining hosts for this loop 11124 1726882389.14939: done getting the remaining hosts for this loop 11124 1726882389.14943: getting the next task for host managed_node1 11124 1726882389.14950: done getting next task for host managed_node1 11124 1726882389.14954: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11124 1726882389.14958: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882389.14978: getting variables 11124 1726882389.14979: in VariableManager get_vars() 11124 1726882389.15018: Calling all_inventory to load vars for managed_node1 11124 1726882389.15021: Calling groups_inventory to load vars for managed_node1 11124 1726882389.15023: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.15032: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.15035: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.15037: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.15871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.16919: done with get_vars() 11124 1726882389.16935: done getting variables 11124 1726882389.16982: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:33:09 -0400 (0:00:00.057) 0:00:29.412 ****** 11124 1726882389.17006: entering _queue_task() for managed_node1/dnf 11124 1726882389.17252: worker is 1 (out of 1 available) 11124 1726882389.17266: exiting _queue_task() for managed_node1/dnf 11124 1726882389.17278: done queuing things up, now waiting for results queue to drain 11124 1726882389.17279: waiting for pending results... 
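The DNF check task queued here (`roles/network/tasks/main.yml:36`) is gated on two conditions that the trace evaluates in sequence. A hedged sketch of such a task: the two `when` expressions are copied from the log, while the `list` query argument is an assumption for illustration, not shown anywhere in this output.

```yaml
# Hypothetical sketch; only the when-conditions appear verbatim in the log,
# the dnf arguments are assumed for illustration.
- name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
  dnf:
    list: NetworkManager  # assumed package query
  when:
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7  # evaluated True
    - __network_wireless_connections_defined or __network_team_connections_defined      # evaluated False
```

Neither wireless nor team connections are defined in `network_connections` (the play only defines bond controller and port profiles), so the second condition is False and the task is skipped.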
11124 1726882389.17472: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11124 1726882389.17577: in run() - task 0e448fcc-3ce9-8362-0f62-000000000082 11124 1726882389.17588: variable 'ansible_search_path' from source: unknown 11124 1726882389.17591: variable 'ansible_search_path' from source: unknown 11124 1726882389.17624: calling self._execute() 11124 1726882389.17700: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.17703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.17712: variable 'omit' from source: magic vars 11124 1726882389.17994: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.18004: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.18141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882389.19738: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882389.19785: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882389.19811: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882389.19836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882389.19857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882389.19928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.19948: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.19969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.19994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.20011: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.20091: variable 'ansible_distribution' from source: facts 11124 1726882389.20095: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.20106: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11124 1726882389.20187: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882389.20273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.20289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.20306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.20336: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.20342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.20373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.20389: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.20405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.20429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.20444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.20470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.20486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 
1726882389.20502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.20526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.20536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.20637: variable 'network_connections' from source: task vars 11124 1726882389.20645: variable 'port2_profile' from source: play vars 11124 1726882389.20694: variable 'port2_profile' from source: play vars 11124 1726882389.20702: variable 'port1_profile' from source: play vars 11124 1726882389.20743: variable 'port1_profile' from source: play vars 11124 1726882389.20749: variable 'controller_profile' from source: play vars 11124 1726882389.20796: variable 'controller_profile' from source: play vars 11124 1726882389.20843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882389.20955: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882389.20982: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882389.21006: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882389.21028: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882389.21058: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11124 1726882389.21080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882389.21100: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.21117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882389.21156: variable '__network_team_connections_defined' from source: role '' defaults 11124 1726882389.21319: variable 'network_connections' from source: task vars 11124 1726882389.21324: variable 'port2_profile' from source: play vars 11124 1726882389.21365: variable 'port2_profile' from source: play vars 11124 1726882389.21372: variable 'port1_profile' from source: play vars 11124 1726882389.21412: variable 'port1_profile' from source: play vars 11124 1726882389.21425: variable 'controller_profile' from source: play vars 11124 1726882389.21465: variable 'controller_profile' from source: play vars 11124 1726882389.21483: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11124 1726882389.21486: when evaluation is False, skipping this task 11124 1726882389.21488: _execute() done 11124 1726882389.21491: dumping result to json 11124 1726882389.21493: done dumping result, returning 11124 1726882389.21501: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-000000000082] 11124 1726882389.21506: sending task result for task 
0e448fcc-3ce9-8362-0f62-000000000082 11124 1726882389.21603: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000082 11124 1726882389.21606: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11124 1726882389.21662: no more pending results, returning what we have 11124 1726882389.21667: results queue empty 11124 1726882389.21668: checking for any_errors_fatal 11124 1726882389.21673: done checking for any_errors_fatal 11124 1726882389.21674: checking for max_fail_percentage 11124 1726882389.21676: done checking for max_fail_percentage 11124 1726882389.21676: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.21678: done checking to see if all hosts have failed 11124 1726882389.21678: getting the remaining hosts for this loop 11124 1726882389.21680: done getting the remaining hosts for this loop 11124 1726882389.21683: getting the next task for host managed_node1 11124 1726882389.21690: done getting next task for host managed_node1 11124 1726882389.21694: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11124 1726882389.21698: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882389.21717: getting variables 11124 1726882389.21718: in VariableManager get_vars() 11124 1726882389.21766: Calling all_inventory to load vars for managed_node1 11124 1726882389.21769: Calling groups_inventory to load vars for managed_node1 11124 1726882389.21771: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.21780: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.21783: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.21786: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.22604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.23558: done with get_vars() 11124 1726882389.23576: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11124 1726882389.23630: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:33:09 -0400 (0:00:00.066) 0:00:29.479 ****** 11124 1726882389.23655: entering _queue_task() for managed_node1/yum 11124 1726882389.23899: worker is 1 (out of 1 available) 11124 
1726882389.23912: exiting _queue_task() for managed_node1/yum 11124 1726882389.23933: done queuing things up, now waiting for results queue to drain 11124 1726882389.23935: waiting for pending results... 11124 1726882389.24140: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11124 1726882389.24236: in run() - task 0e448fcc-3ce9-8362-0f62-000000000083 11124 1726882389.24247: variable 'ansible_search_path' from source: unknown 11124 1726882389.24251: variable 'ansible_search_path' from source: unknown 11124 1726882389.24287: calling self._execute() 11124 1726882389.24359: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.24364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.24374: variable 'omit' from source: magic vars 11124 1726882389.24635: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.24644: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.24770: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882389.26857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882389.26902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882389.26928: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882389.26954: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882389.26980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882389.27036: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.27312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.27332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.27361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.27375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.27444: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.27459: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11124 1726882389.27462: when evaluation is False, skipping this task 11124 1726882389.27466: _execute() done 11124 1726882389.27469: dumping result to json 11124 1726882389.27472: done dumping result, returning 11124 1726882389.27479: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-000000000083] 11124 1726882389.27484: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000083 11124 1726882389.27580: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000083 11124 1726882389.27583: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11124 1726882389.27628: no more pending results, returning what we have 11124 1726882389.27632: results queue empty 11124 1726882389.27633: checking for any_errors_fatal 11124 1726882389.27639: done checking for any_errors_fatal 11124 1726882389.27639: checking for max_fail_percentage 11124 1726882389.27641: done checking for max_fail_percentage 11124 1726882389.27642: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.27643: done checking to see if all hosts have failed 11124 1726882389.27644: getting the remaining hosts for this loop 11124 1726882389.27645: done getting the remaining hosts for this loop 11124 1726882389.27648: getting the next task for host managed_node1 11124 1726882389.27656: done getting next task for host managed_node1 11124 1726882389.27660: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11124 1726882389.27665: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882389.27685: getting variables 11124 1726882389.27686: in VariableManager get_vars() 11124 1726882389.27722: Calling all_inventory to load vars for managed_node1 11124 1726882389.27725: Calling groups_inventory to load vars for managed_node1 11124 1726882389.27727: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.27736: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.27739: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.27741: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.29597: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.31366: done with get_vars() 11124 1726882389.31393: done getting variables 11124 1726882389.31456: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:33:09 -0400 (0:00:00.078) 0:00:29.557 ****** 11124 1726882389.31495: entering _queue_task() for managed_node1/fail 11124 1726882389.31802: worker is 1 (out of 1 available) 11124 1726882389.31815: exiting _queue_task() for managed_node1/fail 11124 1726882389.31827: done queuing things up, now waiting for results queue to drain 11124 1726882389.31828: waiting for pending results... 
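The final `fail` task queued in this excerpt (`roles/network/tasks/main.yml:60`) asks for explicit consent before NetworkManager is restarted for wireless or team interfaces. The log is cut off before the full condition is evaluated, so the sketch below is a loose, hypothetical reconstruction: the consent variable name and message are assumptions, not taken from the log.

```yaml
# Hypothetical sketch; the log excerpt ends before this task's conditions
# finish evaluating. The consent variable name below is an assumption.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  fail:
    msg: Restarting NetworkManager requires explicit consent  # assumed wording
  when:
    - __network_wireless_connections_defined or __network_team_connections_defined
    - not network_restart_consent | default(false)  # hypothetical variable
```

Given that the earlier tasks in this trace already evaluated `__network_wireless_connections_defined or __network_team_connections_defined` as False for this host, this task would be expected to skip for the same reason.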
11124 1726882389.32386: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
11124 1726882389.32394: in run() - task 0e448fcc-3ce9-8362-0f62-000000000084
11124 1726882389.32397: variable 'ansible_search_path' from source: unknown
11124 1726882389.32400: variable 'ansible_search_path' from source: unknown
11124 1726882389.32404: calling self._execute()
11124 1726882389.32510: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882389.32520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882389.32534: variable 'omit' from source: magic vars
11124 1726882389.32919: variable 'ansible_distribution_major_version' from source: facts
11124 1726882389.32940: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882389.33060: variable '__network_wireless_connections_defined' from source: role '' defaults
11124 1726882389.33264: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11124 1726882389.35693: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11124 1726882389.35768: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11124 1726882389.35812: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11124 1726882389.35852: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11124 1726882389.35895: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11124 1726882389.35976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.36026: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.36059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.36114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.36133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.36184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.36236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.36302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.36362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.36445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.36582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.36608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.36635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.36799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.36817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.37161: variable 'network_connections' from source: task vars
11124 1726882389.37266: variable 'port2_profile' from source: play vars
11124 1726882389.37344: variable 'port2_profile' from source: play vars
11124 1726882389.37416: variable 'port1_profile' from source: play vars
11124 1726882389.37579: variable 'port1_profile' from source: play vars
11124 1726882389.37592: variable 'controller_profile' from source: play vars
11124 1726882389.37774: variable 'controller_profile' from source: play vars
11124 1726882389.37882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11124 1726882389.38154: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11124 1726882389.38304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11124 1726882389.38343: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11124 1726882389.38421: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11124 1726882389.38570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11124 1726882389.38598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11124 1726882389.38706: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.38741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11124 1726882389.38861: variable '__network_team_connections_defined' from source: role '' defaults
11124 1726882389.39467: variable 'network_connections' from source: task vars
11124 1726882389.39577: variable 'port2_profile' from source: play vars
11124 1726882389.39645: variable 'port2_profile' from source: play vars
11124 1726882389.39706: variable 'port1_profile' from source: play vars
11124 1726882389.39872: variable 'port1_profile' from source: play vars
11124 1726882389.39931: variable 'controller_profile' from source: play vars
11124 1726882389.40067: variable 'controller_profile' from source: play vars
11124 1726882389.40162: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
11124 1726882389.40183: when evaluation is False, skipping this task
11124 1726882389.40245: _execute() done
11124 1726882389.40258: dumping result to json
11124 1726882389.40269: done dumping result, returning
11124 1726882389.40282: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-000000000084]
11124 1726882389.40292: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000084
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
11124 1726882389.40458: no more pending results, returning what we have
11124 1726882389.40463: results queue empty
11124 1726882389.40466: checking for any_errors_fatal
11124 1726882389.40474: done checking for any_errors_fatal
11124 1726882389.40475: checking for max_fail_percentage
11124 1726882389.40476: done checking for max_fail_percentage
11124 1726882389.40477: checking to see if all hosts have failed and the running result is not ok
11124 1726882389.40479: done checking to see if all hosts have failed
11124 1726882389.40479: getting the remaining hosts for this loop
11124 1726882389.40481: done getting the remaining hosts for this loop
11124 1726882389.40485: getting the next task for host managed_node1
11124 1726882389.40493: done getting next task for host managed_node1
11124 1726882389.40498: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
11124 1726882389.40502: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11124 1726882389.40523: getting variables
11124 1726882389.40526: in VariableManager get_vars()
11124 1726882389.40576: Calling all_inventory to load vars for managed_node1
11124 1726882389.40579: Calling groups_inventory to load vars for managed_node1
11124 1726882389.40581: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882389.40594: Calling all_plugins_play to load vars for managed_node1
11124 1726882389.40597: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882389.40601: Calling groups_plugins_play to load vars for managed_node1
11124 1726882389.41998: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000084
11124 1726882389.42001: WORKER PROCESS EXITING
11124 1726882389.42727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882389.44630: done with get_vars()
11124 1726882389.44667: done getting variables
11124 1726882389.44726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path:
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 21:33:09 -0400 (0:00:00.132) 0:00:29.690 ******
11124 1726882389.44768: entering _queue_task() for managed_node1/package
11124 1726882389.45114: worker is 1 (out of 1 available)
11124 1726882389.45127: exiting _queue_task() for managed_node1/package
11124 1726882389.45140: done queuing things up, now waiting for results queue to drain
11124 1726882389.45141: waiting for pending results...
11124 1726882389.45484: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages
11124 1726882389.45799: in run() - task 0e448fcc-3ce9-8362-0f62-000000000085
11124 1726882389.45822: variable 'ansible_search_path' from source: unknown
11124 1726882389.45830: variable 'ansible_search_path' from source: unknown
11124 1726882389.45885: calling self._execute()
11124 1726882389.45991: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882389.46007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882389.46022: variable 'omit' from source: magic vars
11124 1726882389.46588: variable 'ansible_distribution_major_version' from source: facts
11124 1726882389.46672: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882389.46999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11124 1726882389.47312: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11124 1726882389.47369: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11124 1726882389.47410: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11124 1726882389.47502: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11124 1726882389.47620: variable 'network_packages' from source: role '' defaults
11124 1726882389.47743: variable '__network_provider_setup' from source: role '' defaults
11124 1726882389.47762: variable '__network_service_name_default_nm' from source: role '' defaults
11124 1726882389.47830: variable '__network_service_name_default_nm' from source: role '' defaults
11124 1726882389.47845: variable '__network_packages_default_nm' from source: role '' defaults
11124 1726882389.47915: variable '__network_packages_default_nm' from source: role '' defaults
11124 1726882389.48113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11124 1726882389.51137: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11124 1726882389.51230: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11124 1726882389.51280: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11124 1726882389.51319: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11124 1726882389.51358: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11124 1726882389.51444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.51489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.51519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.51576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.51597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.51647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.51684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.51713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.51765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.51791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.52040: variable '__network_packages_default_gobject_packages' from source: role '' defaults
11124 1726882389.52160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.52189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.52221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.52270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.52289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.52384: variable 'ansible_python' from source: facts
11124 1726882389.52413: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
11124 1726882389.52505: variable '__network_wpa_supplicant_required' from source: role '' defaults
11124 1726882389.52591: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
11124 1726882389.52718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.52746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.52785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.52828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.52847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.52905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11124 1726882389.52943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11124 1726882389.52976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.53017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11124 1726882389.53036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11124 1726882389.53194: variable 'network_connections' from source: task vars
11124 1726882389.53206: variable 'port2_profile' from source: play vars
11124 1726882389.53316: variable 'port2_profile' from source: play vars
11124 1726882389.53330: variable 'port1_profile' from source: play vars
11124 1726882389.53432: variable 'port1_profile' from source: play vars
11124 1726882389.53454: variable 'controller_profile' from source: play vars
11124 1726882389.53562:
variable 'controller_profile' from source: play vars
11124 1726882389.53649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11124 1726882389.53690: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11124 1726882389.53728: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11124 1726882389.53776: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11124 1726882389.53830: variable '__network_wireless_connections_defined' from source: role '' defaults
11124 1726882389.54096: variable 'network_connections' from source: task vars
11124 1726882389.54106: variable 'port2_profile' from source: play vars
11124 1726882389.54203: variable 'port2_profile' from source: play vars
11124 1726882389.54217: variable 'port1_profile' from source: play vars
11124 1726882389.54315: variable 'port1_profile' from source: play vars
11124 1726882389.54329: variable 'controller_profile' from source: play vars
11124 1726882389.54425: variable 'controller_profile' from source: play vars
11124 1726882389.54462: variable '__network_packages_default_wireless' from source: role '' defaults
11124 1726882389.54544: variable '__network_wireless_connections_defined' from source: role '' defaults
11124 1726882389.54863: variable 'network_connections' from source: task vars
11124 1726882389.54874: variable 'port2_profile' from source: play vars
11124 1726882389.54944: variable 'port2_profile' from source: play vars
11124 1726882389.54959: variable 'port1_profile' from source: play vars
11124 1726882389.55024: variable 'port1_profile' from source: play vars
11124 1726882389.55035: variable 'controller_profile' from source: play vars
11124 1726882389.55128: variable 'controller_profile' from source: play vars
11124 1726882389.55165: variable '__network_packages_default_team' from source: role '' defaults
11124 1726882389.55277: variable '__network_team_connections_defined' from source: role '' defaults
11124 1726882389.56594: variable 'network_connections' from source: task vars
11124 1726882389.56605: variable 'port2_profile' from source: play vars
11124 1726882389.56694: variable 'port2_profile' from source: play vars
11124 1726882389.56720: variable 'port1_profile' from source: play vars
11124 1726882389.56812: variable 'port1_profile' from source: play vars
11124 1726882389.56824: variable 'controller_profile' from source: play vars
11124 1726882389.56897: variable 'controller_profile' from source: play vars
11124 1726882389.56986: variable '__network_service_name_default_initscripts' from source: role '' defaults
11124 1726882389.57124: variable '__network_service_name_default_initscripts' from source: role '' defaults
11124 1726882389.57135: variable '__network_packages_default_initscripts' from source: role '' defaults
11124 1726882389.57204: variable '__network_packages_default_initscripts' from source: role '' defaults
11124 1726882389.57456: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
11124 1726882389.57998: variable 'network_connections' from source: task vars
11124 1726882389.58009: variable 'port2_profile' from source: play vars
11124 1726882389.58081: variable 'port2_profile' from source: play vars
11124 1726882389.58093: variable 'port1_profile' from source: play vars
11124 1726882389.58155: variable 'port1_profile' from source: play vars
11124 1726882389.58171: variable 'controller_profile' from source: play vars
11124 1726882389.58230: variable 'controller_profile' from source: play vars
11124 1726882389.58242: variable 'ansible_distribution' from source: facts
11124 1726882389.58249: variable '__network_rh_distros' from source: role '' defaults
11124 1726882389.58266: variable 'ansible_distribution_major_version' from source: facts
11124 1726882389.58287: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
11124 1726882389.58458: variable 'ansible_distribution' from source: facts
11124 1726882389.58469: variable '__network_rh_distros' from source: role '' defaults
11124 1726882389.58478: variable 'ansible_distribution_major_version' from source: facts
11124 1726882389.58498: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
11124 1726882389.58671: variable 'ansible_distribution' from source: facts
11124 1726882389.58680: variable '__network_rh_distros' from source: role '' defaults
11124 1726882389.58689: variable 'ansible_distribution_major_version' from source: facts
11124 1726882389.58733: variable 'network_provider' from source: set_fact
11124 1726882389.58757: variable 'ansible_facts' from source: unknown
11124 1726882389.59558: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
11124 1726882389.59569: when evaluation is False, skipping this task
11124 1726882389.59576: _execute() done
11124 1726882389.59583: dumping result to json
11124 1726882389.59589: done dumping result, returning
11124 1726882389.59610: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0e448fcc-3ce9-8362-0f62-000000000085]
11124 1726882389.59619: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000085
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
11124 1726882389.59780: no more pending results, returning what we have
11124 1726882389.59784: results queue empty
11124 1726882389.59785: checking for any_errors_fatal
11124 1726882389.59794: done checking for any_errors_fatal
11124 1726882389.59795: checking for max_fail_percentage
11124 1726882389.59797: done checking for max_fail_percentage
11124 1726882389.59798: checking to see if all hosts have failed and the running result is not ok
11124 1726882389.59799: done checking to see if all hosts have failed
11124 1726882389.59799: getting the remaining hosts for this loop
11124 1726882389.59801: done getting the remaining hosts for this loop
11124 1726882389.59804: getting the next task for host managed_node1
11124 1726882389.59812: done getting next task for host managed_node1
11124 1726882389.59820: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11124 1726882389.59825: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11124 1726882389.59845: getting variables
11124 1726882389.59847: in VariableManager get_vars()
11124 1726882389.59895: Calling all_inventory to load vars for managed_node1
11124 1726882389.59898: Calling groups_inventory to load vars for managed_node1
11124 1726882389.59901: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882389.59912: Calling all_plugins_play to load vars for managed_node1
11124 1726882389.59916: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882389.59918: Calling groups_plugins_play to load vars for managed_node1
11124 1726882389.60883: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000085
11124 1726882389.60886: WORKER PROCESS EXITING
11124 1726882389.62087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882389.64062: done with get_vars()
11124 1726882389.64089: done getting variables
11124 1726882389.64156: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 21:33:09 -0400 (0:00:00.194) 0:00:29.884 ******
11124 1726882389.64194: entering _queue_task() for managed_node1/package
11124 1726882389.64540: worker is 1 (out of 1 available)
11124 1726882389.64559: exiting _queue_task() for managed_node1/package
11124 1726882389.64574: done queuing things up, now waiting for results queue to drain
11124 1726882389.64576: waiting for pending results...
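The 'Install packages' skip recorded earlier in this log hinges on the condition `not network_packages is subset(ansible_facts.packages.keys())`: the package task runs only when at least one requested package is missing from the gathered package facts, so on a fully provisioned host it is skipped without ever invoking the package manager. A minimal sketch of a task gated this way (task name and variables are taken from the log; the module invocation itself is an assumption, not the role's actual source):

```yaml
# Sketch only (hypothetical) -- illustrates the skip condition logged above.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  when: not network_packages is subset(ansible_facts.packages.keys())
```

This pattern depends on package facts having been gathered beforehand (e.g. via the `package_facts` module), since `ansible_facts.packages` is not populated by default.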
11124 1726882389.64876: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11124 1726882389.65037: in run() - task 0e448fcc-3ce9-8362-0f62-000000000086
11124 1726882389.65065: variable 'ansible_search_path' from source: unknown
11124 1726882389.65074: variable 'ansible_search_path' from source: unknown
11124 1726882389.65121: calling self._execute()
11124 1726882389.65231: variable 'ansible_host' from source: host vars for 'managed_node1'
11124 1726882389.65244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11124 1726882389.65261: variable 'omit' from source: magic vars
11124 1726882389.65662: variable 'ansible_distribution_major_version' from source: facts
11124 1726882389.65685: Evaluated conditional (ansible_distribution_major_version != '6'): True
11124 1726882389.65812: variable 'network_state' from source: role '' defaults
11124 1726882389.65826: Evaluated conditional (network_state != {}): False
11124 1726882389.65832: when evaluation is False, skipping this task
11124 1726882389.65838: _execute() done
11124 1726882389.65845: dumping result to json
11124 1726882389.65854: done dumping result, returning
11124 1726882389.65871: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0e448fcc-3ce9-8362-0f62-000000000086]
11124 1726882389.65883: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000086
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11124 1726882389.66039: no more pending results, returning what we have
11124 1726882389.66043: results queue empty
11124 1726882389.66044: checking for any_errors_fatal
11124 1726882389.66055: done checking for any_errors_fatal
11124 1726882389.66056: checking for max_fail_percentage
11124 1726882389.66057: done checking for max_fail_percentage
11124 1726882389.66059: checking to see if all hosts have failed and the running result is not ok
11124 1726882389.66060: done checking to see if all hosts have failed
11124 1726882389.66061: getting the remaining hosts for this loop
11124 1726882389.66062: done getting the remaining hosts for this loop
11124 1726882389.66067: getting the next task for host managed_node1
11124 1726882389.66075: done getting next task for host managed_node1
11124 1726882389.66079: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11124 1726882389.66085: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
11124 1726882389.66108: getting variables
11124 1726882389.66110: in VariableManager get_vars()
11124 1726882389.66154: Calling all_inventory to load vars for managed_node1
11124 1726882389.66157: Calling groups_inventory to load vars for managed_node1
11124 1726882389.66159: Calling all_plugins_inventory to load vars for managed_node1
11124 1726882389.66174: Calling all_plugins_play to load vars for managed_node1
11124 1726882389.66178: Calling groups_plugins_inventory to load vars for managed_node1
11124 1726882389.66181: Calling groups_plugins_play to load vars for managed_node1
11124 1726882389.67203: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000086
11124 1726882389.67206: WORKER PROCESS EXITING
11124 1726882389.67989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11124 1726882389.69961: done with get_vars()
11124 1726882389.69987: done getting variables
11124 1726882389.70057: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 21:33:09 -0400 (0:00:00.059) 0:00:29.943 ******
11124 1726882389.70098: entering _queue_task() for managed_node1/package
11124 1726882389.70466: worker is 1 (out of 1 available)
11124 1726882389.70479: exiting _queue_task() for managed_node1/package
11124 1726882389.70495: done queuing things up, now waiting for results queue to drain
11124 1726882389.70497: waiting for pending results...
11124 1726882389.70799: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11124 1726882389.70953: in run() - task 0e448fcc-3ce9-8362-0f62-000000000087 11124 1726882389.70975: variable 'ansible_search_path' from source: unknown 11124 1726882389.70982: variable 'ansible_search_path' from source: unknown 11124 1726882389.71022: calling self._execute() 11124 1726882389.71130: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.71145: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.71169: variable 'omit' from source: magic vars 11124 1726882389.71561: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.71585: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.71713: variable 'network_state' from source: role '' defaults 11124 1726882389.71727: Evaluated conditional (network_state != {}): False 11124 1726882389.71733: when evaluation is False, skipping this task 11124 1726882389.71740: _execute() done 11124 1726882389.71746: dumping result to json 11124 1726882389.71756: done dumping result, returning 11124 1726882389.71768: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0e448fcc-3ce9-8362-0f62-000000000087] 11124 1726882389.71778: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000087 11124 1726882389.71902: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000087 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882389.71957: no more pending results, returning what we have 11124 1726882389.71961: results queue empty 11124 1726882389.71962: checking for any_errors_fatal 11124 1726882389.71970: done checking for 
any_errors_fatal 11124 1726882389.71971: checking for max_fail_percentage 11124 1726882389.71972: done checking for max_fail_percentage 11124 1726882389.71973: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.71975: done checking to see if all hosts have failed 11124 1726882389.71975: getting the remaining hosts for this loop 11124 1726882389.71977: done getting the remaining hosts for this loop 11124 1726882389.71981: getting the next task for host managed_node1 11124 1726882389.71989: done getting next task for host managed_node1 11124 1726882389.71992: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11124 1726882389.71998: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882389.72021: getting variables 11124 1726882389.72024: in VariableManager get_vars() 11124 1726882389.72069: Calling all_inventory to load vars for managed_node1 11124 1726882389.72073: Calling groups_inventory to load vars for managed_node1 11124 1726882389.72075: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.72089: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.72092: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.72095: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.73103: WORKER PROCESS EXITING 11124 1726882389.73891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.75684: done with get_vars() 11124 1726882389.75714: done getting variables 11124 1726882389.75782: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:33:09 -0400 (0:00:00.057) 0:00:30.000 ****** 11124 1726882389.75819: entering _queue_task() for managed_node1/service 11124 1726882389.76177: worker is 1 (out of 1 available) 11124 1726882389.76189: exiting _queue_task() for managed_node1/service 11124 1726882389.76201: done queuing things up, now waiting for results queue to drain 11124 1726882389.76203: waiting for pending results... 
11124 1726882389.76553: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11124 1726882389.76712: in run() - task 0e448fcc-3ce9-8362-0f62-000000000088 11124 1726882389.76733: variable 'ansible_search_path' from source: unknown 11124 1726882389.76741: variable 'ansible_search_path' from source: unknown 11124 1726882389.76789: calling self._execute() 11124 1726882389.76897: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.76909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.76922: variable 'omit' from source: magic vars 11124 1726882389.77342: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.77365: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.77491: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882389.77708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882389.80232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882389.80319: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882389.80366: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882389.80416: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882389.80454: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882389.80549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11124 1726882389.80945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.80987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.81033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.81063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.81119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.81153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.81190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.81234: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.81260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.81312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.81340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.81379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.81428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.81447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.81652: variable 'network_connections' from source: task vars 11124 1726882389.81673: variable 'port2_profile' from source: play vars 11124 1726882389.81754: variable 'port2_profile' from source: play vars 11124 1726882389.81771: variable 'port1_profile' from source: play vars 11124 1726882389.81841: variable 'port1_profile' from source: play vars 11124 1726882389.81855: variable 'controller_profile' from source: play vars 11124 1726882389.81923: variable 'controller_profile' from source: play vars 11124 1726882389.82683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882389.83040: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882389.83120: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882389.83216: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882389.83324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882389.83373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882389.83429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882389.83541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.83576: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882389.83745: variable '__network_team_connections_defined' from source: role '' defaults 11124 1726882389.84134: variable 'network_connections' from source: task vars 11124 1726882389.84144: variable 'port2_profile' from source: play vars 11124 1726882389.84219: variable 'port2_profile' from source: play vars 11124 1726882389.84234: variable 'port1_profile' from source: play vars 11124 1726882389.84306: variable 'port1_profile' from source: play vars 11124 1726882389.84318: variable 'controller_profile' from source: play vars 11124 1726882389.84392: variable 'controller_profile' from source: play vars 11124 1726882389.84424: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11124 1726882389.84446: when evaluation is False, skipping this task 11124 1726882389.84457: _execute() done 11124 1726882389.84468: dumping result to json 11124 1726882389.84475: done dumping result, returning 11124 1726882389.84487: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0e448fcc-3ce9-8362-0f62-000000000088] 11124 1726882389.84501: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000088 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11124 1726882389.84667: no more pending results, returning what we have 11124 1726882389.84671: results queue empty 11124 1726882389.84672: checking for any_errors_fatal 11124 1726882389.84678: done checking for any_errors_fatal 11124 1726882389.84678: checking for max_fail_percentage 11124 1726882389.84681: done checking for max_fail_percentage 11124 1726882389.84682: checking to see if all hosts have failed and the running result is not ok 11124 1726882389.84683: done checking to see if all hosts have failed 11124 1726882389.84684: getting the remaining hosts for this loop 11124 1726882389.84685: done getting the remaining hosts for this loop 11124 1726882389.84689: getting the next task for host managed_node1 11124 1726882389.84696: done getting next task for host managed_node1 11124 1726882389.84700: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11124 1726882389.84705: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882389.84725: getting variables 11124 1726882389.84727: in VariableManager get_vars() 11124 1726882389.84773: Calling all_inventory to load vars for managed_node1 11124 1726882389.84776: Calling groups_inventory to load vars for managed_node1 11124 1726882389.84779: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882389.84790: Calling all_plugins_play to load vars for managed_node1 11124 1726882389.84793: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882389.84796: Calling groups_plugins_play to load vars for managed_node1 11124 1726882389.85808: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000088 11124 1726882389.85812: WORKER PROCESS EXITING 11124 1726882389.86846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882389.89576: done with get_vars() 11124 1726882389.89613: done getting variables 11124 1726882389.89682: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:33:09 -0400 (0:00:00.138) 0:00:30.139 ****** 11124 1726882389.89719: entering _queue_task() for managed_node1/service 11124 1726882389.90084: worker is 1 (out of 1 available) 11124 1726882389.90098: exiting _queue_task() for managed_node1/service 11124 1726882389.90110: done queuing things up, now waiting for results queue to drain 11124 1726882389.90112: waiting for pending results... 11124 1726882389.90425: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11124 1726882389.90603: in run() - task 0e448fcc-3ce9-8362-0f62-000000000089 11124 1726882389.90631: variable 'ansible_search_path' from source: unknown 11124 1726882389.90639: variable 'ansible_search_path' from source: unknown 11124 1726882389.90688: calling self._execute() 11124 1726882389.90797: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882389.90808: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882389.90821: variable 'omit' from source: magic vars 11124 1726882389.91221: variable 'ansible_distribution_major_version' from source: facts 11124 1726882389.91239: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882389.91423: variable 'network_provider' from source: set_fact 11124 1726882389.91434: variable 'network_state' from source: role '' defaults 11124 1726882389.91449: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11124 1726882389.91466: variable 'omit' from source: magic vars 11124 1726882389.91544: variable 'omit' from source: magic vars 11124 1726882389.91584: variable 'network_service_name' from source: role '' defaults 11124 1726882389.91665: variable 'network_service_name' from source: role '' defaults 11124 1726882389.91787: variable '__network_provider_setup' from 
source: role '' defaults 11124 1726882389.91800: variable '__network_service_name_default_nm' from source: role '' defaults 11124 1726882389.91878: variable '__network_service_name_default_nm' from source: role '' defaults 11124 1726882389.91892: variable '__network_packages_default_nm' from source: role '' defaults 11124 1726882389.91969: variable '__network_packages_default_nm' from source: role '' defaults 11124 1726882389.92216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882389.95233: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882389.95305: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882389.95342: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882389.95392: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882389.95422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882389.95503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.95536: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.95562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.95605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.95619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.95668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.95691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.95716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.95759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.95775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.96015: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11124 1726882389.96135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.96158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11124 1726882389.96189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.96229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.96244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.96335: variable 'ansible_python' from source: facts 11124 1726882389.96357: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11124 1726882389.96442: variable '__network_wpa_supplicant_required' from source: role '' defaults 11124 1726882389.96523: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11124 1726882389.96735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.96894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.96920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.96958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 
1726882389.96974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.97024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882389.97046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882389.97075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.97118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882389.97131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882389.97293: variable 'network_connections' from source: task vars 11124 1726882389.97296: variable 'port2_profile' from source: play vars 11124 1726882389.97374: variable 'port2_profile' from source: play vars 11124 1726882389.97394: variable 'port1_profile' from source: play vars 11124 1726882389.97459: variable 'port1_profile' from source: play vars 11124 1726882389.97472: variable 'controller_profile' from source: play vars 11124 1726882389.97546: variable 'controller_profile' from source: play vars 11124 1726882389.97657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 
1726882389.97837: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882389.97893: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882389.97935: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882389.97991: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882389.98055: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882389.98083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882389.98115: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882389.98147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882389.98200: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882389.98474: variable 'network_connections' from source: task vars 11124 1726882389.98480: variable 'port2_profile' from source: play vars 11124 1726882389.98556: variable 'port2_profile' from source: play vars 11124 1726882389.98600: variable 'port1_profile' from source: play vars 11124 1726882389.98759: variable 'port1_profile' from source: play vars 11124 1726882389.98775: variable 'controller_profile' from source: play vars 11124 1726882389.98858: variable 'controller_profile' from source: play vars 11124 
1726882389.98898: variable '__network_packages_default_wireless' from source: role '' defaults 11124 1726882389.98981: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882389.99319: variable 'network_connections' from source: task vars 11124 1726882389.99322: variable 'port2_profile' from source: play vars 11124 1726882389.99343: variable 'port2_profile' from source: play vars 11124 1726882389.99353: variable 'port1_profile' from source: play vars 11124 1726882389.99423: variable 'port1_profile' from source: play vars 11124 1726882389.99431: variable 'controller_profile' from source: play vars 11124 1726882389.99499: variable 'controller_profile' from source: play vars 11124 1726882389.99523: variable '__network_packages_default_team' from source: role '' defaults 11124 1726882389.99734: variable '__network_team_connections_defined' from source: role '' defaults 11124 1726882390.00128: variable 'network_connections' from source: task vars 11124 1726882390.00138: variable 'port2_profile' from source: play vars 11124 1726882390.00205: variable 'port2_profile' from source: play vars 11124 1726882390.00215: variable 'port1_profile' from source: play vars 11124 1726882390.00298: variable 'port1_profile' from source: play vars 11124 1726882390.00303: variable 'controller_profile' from source: play vars 11124 1726882390.00402: variable 'controller_profile' from source: play vars 11124 1726882390.00493: variable '__network_service_name_default_initscripts' from source: role '' defaults 11124 1726882390.00554: variable '__network_service_name_default_initscripts' from source: role '' defaults 11124 1726882390.00557: variable '__network_packages_default_initscripts' from source: role '' defaults 11124 1726882390.00623: variable '__network_packages_default_initscripts' from source: role '' defaults 11124 1726882390.00975: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11124 1726882390.01424: 
variable 'network_connections' from source: task vars 11124 1726882390.01428: variable 'port2_profile' from source: play vars 11124 1726882390.01499: variable 'port2_profile' from source: play vars 11124 1726882390.01507: variable 'port1_profile' from source: play vars 11124 1726882390.01618: variable 'port1_profile' from source: play vars 11124 1726882390.01622: variable 'controller_profile' from source: play vars 11124 1726882390.01666: variable 'controller_profile' from source: play vars 11124 1726882390.01675: variable 'ansible_distribution' from source: facts 11124 1726882390.01678: variable '__network_rh_distros' from source: role '' defaults 11124 1726882390.01685: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.01700: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11124 1726882390.01995: variable 'ansible_distribution' from source: facts 11124 1726882390.01998: variable '__network_rh_distros' from source: role '' defaults 11124 1726882390.02005: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.02017: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11124 1726882390.02539: variable 'ansible_distribution' from source: facts 11124 1726882390.02543: variable '__network_rh_distros' from source: role '' defaults 11124 1726882390.02548: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.02594: variable 'network_provider' from source: set_fact 11124 1726882390.02619: variable 'omit' from source: magic vars 11124 1726882390.02648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882390.02678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882390.02702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 
1726882390.02719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882390.02729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882390.02759: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882390.02763: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882390.02768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882390.02871: Set connection var ansible_shell_executable to /bin/sh 11124 1726882390.02879: Set connection var ansible_shell_type to sh 11124 1726882390.02887: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882390.02893: Set connection var ansible_timeout to 10 11124 1726882390.02898: Set connection var ansible_pipelining to False 11124 1726882390.02901: Set connection var ansible_connection to ssh 11124 1726882390.02932: variable 'ansible_shell_executable' from source: unknown 11124 1726882390.02935: variable 'ansible_connection' from source: unknown 11124 1726882390.02937: variable 'ansible_module_compression' from source: unknown 11124 1726882390.02940: variable 'ansible_shell_type' from source: unknown 11124 1726882390.02942: variable 'ansible_shell_executable' from source: unknown 11124 1726882390.02944: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882390.02949: variable 'ansible_pipelining' from source: unknown 11124 1726882390.02954: variable 'ansible_timeout' from source: unknown 11124 1726882390.02956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882390.03070: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882390.03079: variable 'omit' from source: magic vars 11124 1726882390.03084: starting attempt loop 11124 1726882390.03087: running the handler 11124 1726882390.03174: variable 'ansible_facts' from source: unknown 11124 1726882390.05293: _low_level_execute_command(): starting 11124 1726882390.05297: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882390.05930: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.05935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.05967: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.05971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882390.05974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.06014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882390.06020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882390.06031: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11124 1726882390.06142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882390.07816: stdout chunk (state=3): >>>/root <<< 11124 1726882390.07917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882390.08010: stderr chunk (state=3): >>><<< 11124 1726882390.08025: stdout chunk (state=3): >>><<< 11124 1726882390.08148: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882390.08154: _low_level_execute_command(): starting 11124 1726882390.08158: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363 `" && echo ansible-tmp-1726882390.0806966-12496-63784833673363="` echo 
/root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363 `" ) && sleep 0' 11124 1726882390.08858: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882390.08877: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.08892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.08910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.08954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882390.09079: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882390.09094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.09110: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882390.09121: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882390.09132: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882390.09142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.09158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.09177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.09189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882390.09200: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882390.09213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.09295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882390.09727: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882390.09913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882390.11787: stdout chunk (state=3): >>>ansible-tmp-1726882390.0806966-12496-63784833673363=/root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363 <<< 11124 1726882390.11889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882390.12577: stderr chunk (state=3): >>><<< 11124 1726882390.12580: stdout chunk (state=3): >>><<< 11124 1726882390.12584: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882390.0806966-12496-63784833673363=/root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882390.12592: variable 
'ansible_module_compression' from source: unknown 11124 1726882390.12594: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11124 1726882390.12596: variable 'ansible_facts' from source: unknown 11124 1726882390.12598: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/AnsiballZ_systemd.py 11124 1726882390.12744: Sending initial data 11124 1726882390.12761: Sent initial data (155 bytes) 11124 1726882390.14168: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882390.14193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.14209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.14241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.14331: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882390.14342: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882390.14359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.14393: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882390.14419: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882390.14426: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882390.14431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.14441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.14446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.14455: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882390.14458: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882390.14466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.14536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882390.14565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882390.14568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882390.14661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882390.16406: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882390.16496: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882390.16596: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp8ys7k_uk /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/AnsiballZ_systemd.py <<< 11124 1726882390.16685: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882390.18833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882390.18960: stderr chunk (state=3): 
>>><<< 11124 1726882390.18965: stdout chunk (state=3): >>><<< 11124 1726882390.18975: done transferring module to remote 11124 1726882390.18985: _low_level_execute_command(): starting 11124 1726882390.18989: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/ /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/AnsiballZ_systemd.py && sleep 0' 11124 1726882390.19454: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.19458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.19495: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882390.19498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.19502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882390.19505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.19566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882390.19569: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882390.19571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882390.19662: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882390.21434: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882390.21494: stderr chunk (state=3): >>><<< 11124 1726882390.21497: stdout chunk (state=3): >>><<< 11124 1726882390.21511: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882390.21514: _low_level_execute_command(): starting 11124 1726882390.21518: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/AnsiballZ_systemd.py && sleep 0' 11124 1726882390.21985: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882390.21999: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.22026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.22038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.22094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882390.22113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882390.22220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882390.47372: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", 
"StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16076800", "MemoryAvailable": "infinity", "CPUUsageNSec": "538446000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": 
"infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0"<<< 11124 1726882390.47413: stdout chunk (state=3): >>>, "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", 
"TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": 
"network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetu<<< 11124 1726882390.47423: stdout chunk (state=3): >>>al": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", 
"enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11124 1726882390.49073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882390.49077: stderr chunk (state=3): >>><<< 11124 1726882390.49079: stdout chunk (state=3): >>><<< 11124 1726882390.49098: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "618", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ExecMainStartTimestampMonotonic": "27221076", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "618", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2455", "MemoryCurrent": "16076800", "MemoryAvailable": "infinity", "CPUUsageNSec": "538446000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", 
"LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": 
"2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target NetworkManager-wait-online.service network.target multi-user.target network.service cloud-init.service", "After": "network-pre.target system.slice cloud-init-local.service sysinit.target systemd-journald.socket basic.target dbus-broker.service dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:32:48 EDT", "StateChangeTimestampMonotonic": "526071006", "InactiveExitTimestamp": "Fri 2024-09-20 21:24:29 EDT", "InactiveExitTimestampMonotonic": "27221264", "ActiveEnterTimestamp": "Fri 2024-09-20 21:24:30 EDT", "ActiveEnterTimestampMonotonic": "28518220", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", 
"CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:24:29 EDT", "ConditionTimestampMonotonic": "27216465", "AssertTimestamp": "Fri 2024-09-20 21:24:29 EDT", "AssertTimestampMonotonic": "27216468", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac59cab3098f415297681de935e089f5", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882390.49275: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882390.49293: _low_level_execute_command(): starting 11124 1726882390.49298: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882390.0806966-12496-63784833673363/ > /dev/null 2>&1 && sleep 0' 11124 1726882390.51037: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.51041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882390.51096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11124 1726882390.51100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882390.51118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882390.51121: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882390.51209: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882390.51228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882390.51371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882390.53498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882390.53931: stderr chunk (state=3): >>><<< 11124 1726882390.53936: stdout chunk (state=3): >>><<< 11124 1726882390.53970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882390.53976: handler run complete 11124 1726882390.54031: attempt loop complete, returning result 11124 1726882390.54034: _execute() done 11124 1726882390.54037: dumping result to json 11124 1726882390.54054: done dumping result, returning 11124 1726882390.54061: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0e448fcc-3ce9-8362-0f62-000000000089] 11124 1726882390.54067: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000089 11124 1726882390.54291: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000089 11124 1726882390.54294: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882390.54360: no more pending results, returning what we have 11124 1726882390.54376: results queue empty 11124 1726882390.54379: checking for any_errors_fatal 11124 1726882390.54384: done checking for any_errors_fatal 11124 1726882390.54385: checking for max_fail_percentage 11124 1726882390.54387: done checking for max_fail_percentage 11124 1726882390.54387: checking to see if all hosts have failed and the running result is not ok 11124 1726882390.54388: done checking to see if all hosts have failed 11124 1726882390.54389: getting the remaining hosts for this loop 11124 1726882390.54390: done getting the remaining hosts for this loop 11124 1726882390.54394: getting the next task for host managed_node1 11124 
1726882390.54400: done getting next task for host managed_node1 11124 1726882390.54404: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11124 1726882390.54408: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882390.54420: getting variables 11124 1726882390.54422: in VariableManager get_vars() 11124 1726882390.54460: Calling all_inventory to load vars for managed_node1 11124 1726882390.54467: Calling groups_inventory to load vars for managed_node1 11124 1726882390.54489: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882390.54498: Calling all_plugins_play to load vars for managed_node1 11124 1726882390.54501: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882390.54503: Calling groups_plugins_play to load vars for managed_node1 11124 1726882390.56584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882390.58496: done with get_vars() 11124 1726882390.58531: done getting variables 11124 1726882390.58604: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:33:10 -0400 (0:00:00.689) 0:00:30.828 ****** 11124 1726882390.58648: entering _queue_task() for managed_node1/service 11124 1726882390.59030: worker is 1 (out of 1 available) 11124 1726882390.59056: exiting _queue_task() for managed_node1/service 11124 1726882390.59071: done queuing things up, now waiting for results queue to drain 11124 1726882390.59073: waiting for pending results... 
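The "Enable and start NetworkManager" result above (module `ansible.legacy.systemd`, args `name=NetworkManager, state=started, enabled=true`, output censored because `no_log: true` was set) corresponds to a role task roughly like the following. This is a hedged sketch reconstructed from the logged `module_args`, not the actual task file in `fedora.linux_system_roles.network`:

```yaml
# Hypothetical reconstruction of the task behind the censored result above.
# Module name and arguments are taken from the logged invocation; the real
# task in the collection's tasks/main.yml may differ.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true   # why the log prints "the output has been hidden ..."
```

With `no_log: true`, Ansible still records `changed`/`ok` status (visible above as `"changed": false`) but replaces the module's return payload with the "censored" placeholder in callback output.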
11124 1726882390.59418: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11124 1726882390.59581: in run() - task 0e448fcc-3ce9-8362-0f62-00000000008a 11124 1726882390.59611: variable 'ansible_search_path' from source: unknown 11124 1726882390.59620: variable 'ansible_search_path' from source: unknown 11124 1726882390.59670: calling self._execute() 11124 1726882390.59790: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882390.59802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882390.59820: variable 'omit' from source: magic vars 11124 1726882390.60249: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.60276: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882390.60419: variable 'network_provider' from source: set_fact 11124 1726882390.60429: Evaluated conditional (network_provider == "nm"): True 11124 1726882390.60535: variable '__network_wpa_supplicant_required' from source: role '' defaults 11124 1726882390.61264: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11124 1726882390.61476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882390.70078: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882390.70158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882390.70192: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882390.70236: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882390.70270: Loading FilterModule 'urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882390.70352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882390.70379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882390.70412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882390.70472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882390.70487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882390.70544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882390.70568: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882390.70592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882390.70630: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882390.70644: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882390.70712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882390.70770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882390.70793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882390.70830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882390.70843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882390.70980: variable 'network_connections' from source: task vars 11124 1726882390.70990: variable 'port2_profile' from source: play vars 11124 1726882390.71095: variable 'port2_profile' from source: play vars 11124 1726882390.71103: variable 'port1_profile' from source: play vars 11124 1726882390.71162: variable 'port1_profile' from source: play vars 11124 1726882390.71171: variable 'controller_profile' from source: play vars 11124 1726882390.71229: variable 'controller_profile' from source: play vars 11124 
1726882390.71295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11124 1726882390.71446: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11124 1726882390.71483: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11124 1726882390.71517: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11124 1726882390.71555: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11124 1726882390.71595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11124 1726882390.71615: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11124 1726882390.71639: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882390.71665: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11124 1726882390.71703: variable '__network_wireless_connections_defined' from source: role '' defaults 11124 1726882390.71926: variable 'network_connections' from source: task vars 11124 1726882390.71929: variable 'port2_profile' from source: play vars 11124 1726882390.71989: variable 'port2_profile' from source: play vars 11124 1726882390.71997: variable 'port1_profile' from source: play vars 11124 1726882390.72078: variable 'port1_profile' from source: play vars 11124 1726882390.72088: variable 
'controller_profile' from source: play vars 11124 1726882390.72147: variable 'controller_profile' from source: play vars 11124 1726882390.72221: Evaluated conditional (__network_wpa_supplicant_required): False 11124 1726882390.72224: when evaluation is False, skipping this task 11124 1726882390.72229: _execute() done 11124 1726882390.72231: dumping result to json 11124 1726882390.72233: done dumping result, returning 11124 1726882390.72242: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0e448fcc-3ce9-8362-0f62-00000000008a] 11124 1726882390.72244: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008a 11124 1726882390.72354: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008a 11124 1726882390.72357: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11124 1726882390.72406: no more pending results, returning what we have 11124 1726882390.72409: results queue empty 11124 1726882390.72410: checking for any_errors_fatal 11124 1726882390.72425: done checking for any_errors_fatal 11124 1726882390.72426: checking for max_fail_percentage 11124 1726882390.72427: done checking for max_fail_percentage 11124 1726882390.72428: checking to see if all hosts have failed and the running result is not ok 11124 1726882390.72429: done checking to see if all hosts have failed 11124 1726882390.72430: getting the remaining hosts for this loop 11124 1726882390.72431: done getting the remaining hosts for this loop 11124 1726882390.72434: getting the next task for host managed_node1 11124 1726882390.72440: done getting next task for host managed_node1 11124 1726882390.72444: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11124 1726882390.72448: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882390.72468: getting variables 11124 1726882390.72470: in VariableManager get_vars() 11124 1726882390.72510: Calling all_inventory to load vars for managed_node1 11124 1726882390.72512: Calling groups_inventory to load vars for managed_node1 11124 1726882390.72515: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882390.72524: Calling all_plugins_play to load vars for managed_node1 11124 1726882390.72527: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882390.72529: Calling groups_plugins_play to load vars for managed_node1 11124 1726882390.77508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882390.78812: done with get_vars() 11124 1726882390.78841: done getting variables 11124 1726882390.78896: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** 
task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:33:10 -0400 (0:00:00.202) 0:00:31.031 ****** 11124 1726882390.78928: entering _queue_task() for managed_node1/service 11124 1726882390.79772: worker is 1 (out of 1 available) 11124 1726882390.79786: exiting _queue_task() for managed_node1/service 11124 1726882390.79798: done queuing things up, now waiting for results queue to drain 11124 1726882390.79800: waiting for pending results... 11124 1726882390.80390: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11124 1726882390.80625: in run() - task 0e448fcc-3ce9-8362-0f62-00000000008b 11124 1726882390.80645: variable 'ansible_search_path' from source: unknown 11124 1726882390.80656: variable 'ansible_search_path' from source: unknown 11124 1726882390.80707: calling self._execute() 11124 1726882390.80815: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882390.80827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882390.80843: variable 'omit' from source: magic vars 11124 1726882390.81234: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.81256: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882390.81375: variable 'network_provider' from source: set_fact 11124 1726882390.81387: Evaluated conditional (network_provider == "initscripts"): False 11124 1726882390.81394: when evaluation is False, skipping this task 11124 1726882390.81401: _execute() done 11124 1726882390.81408: dumping result to json 11124 1726882390.81414: done dumping result, returning 11124 1726882390.81424: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0e448fcc-3ce9-8362-0f62-00000000008b] 11124 1726882390.81435: sending task result for task 
0e448fcc-3ce9-8362-0f62-00000000008b 11124 1726882390.81557: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008b skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11124 1726882390.81610: no more pending results, returning what we have 11124 1726882390.81613: results queue empty 11124 1726882390.81614: checking for any_errors_fatal 11124 1726882390.81624: done checking for any_errors_fatal 11124 1726882390.81625: checking for max_fail_percentage 11124 1726882390.81627: done checking for max_fail_percentage 11124 1726882390.81628: checking to see if all hosts have failed and the running result is not ok 11124 1726882390.81629: done checking to see if all hosts have failed 11124 1726882390.81630: getting the remaining hosts for this loop 11124 1726882390.81632: done getting the remaining hosts for this loop 11124 1726882390.81636: getting the next task for host managed_node1 11124 1726882390.81644: done getting next task for host managed_node1 11124 1726882390.81649: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11124 1726882390.81656: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882390.81682: getting variables 11124 1726882390.81685: in VariableManager get_vars() 11124 1726882390.81729: Calling all_inventory to load vars for managed_node1 11124 1726882390.81732: Calling groups_inventory to load vars for managed_node1 11124 1726882390.81735: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882390.81749: Calling all_plugins_play to load vars for managed_node1 11124 1726882390.81755: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882390.81758: Calling groups_plugins_play to load vars for managed_node1 11124 1726882390.83045: WORKER PROCESS EXITING 11124 1726882390.83627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882390.85522: done with get_vars() 11124 1726882390.85552: done getting variables 11124 1726882390.85615: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:33:10 -0400 (0:00:00.067) 0:00:31.099 ****** 11124 1726882390.85657: entering _queue_task() for managed_node1/copy 11124 1726882390.86014: worker is 1 (out of 1 available) 11124 1726882390.86028: exiting _queue_task() for managed_node1/copy 11124 1726882390.86042: done queuing things up, now waiting for results queue to drain 11124 1726882390.86044: waiting for pending results... 
11124 1726882390.86346: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11124 1726882390.86537: in run() - task 0e448fcc-3ce9-8362-0f62-00000000008c 11124 1726882390.86560: variable 'ansible_search_path' from source: unknown 11124 1726882390.86571: variable 'ansible_search_path' from source: unknown 11124 1726882390.86620: calling self._execute() 11124 1726882390.86726: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882390.86737: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882390.86753: variable 'omit' from source: magic vars 11124 1726882390.87130: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.87152: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882390.87279: variable 'network_provider' from source: set_fact 11124 1726882390.87289: Evaluated conditional (network_provider == "initscripts"): False 11124 1726882390.87295: when evaluation is False, skipping this task 11124 1726882390.87301: _execute() done 11124 1726882390.87307: dumping result to json 11124 1726882390.87314: done dumping result, returning 11124 1726882390.87326: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0e448fcc-3ce9-8362-0f62-00000000008c] 11124 1726882390.87336: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008c skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11124 1726882390.87511: no more pending results, returning what we have 11124 1726882390.87515: results queue empty 11124 1726882390.87517: checking for any_errors_fatal 11124 1726882390.87523: done checking for any_errors_fatal 11124 1726882390.87523: checking for max_fail_percentage 11124 
1726882390.87525: done checking for max_fail_percentage 11124 1726882390.87526: checking to see if all hosts have failed and the running result is not ok 11124 1726882390.87527: done checking to see if all hosts have failed 11124 1726882390.87528: getting the remaining hosts for this loop 11124 1726882390.87530: done getting the remaining hosts for this loop 11124 1726882390.87534: getting the next task for host managed_node1 11124 1726882390.87541: done getting next task for host managed_node1 11124 1726882390.87547: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11124 1726882390.87554: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882390.87577: getting variables 11124 1726882390.87579: in VariableManager get_vars() 11124 1726882390.87622: Calling all_inventory to load vars for managed_node1 11124 1726882390.87625: Calling groups_inventory to load vars for managed_node1 11124 1726882390.87627: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882390.87640: Calling all_plugins_play to load vars for managed_node1 11124 1726882390.87643: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882390.87646: Calling groups_plugins_play to load vars for managed_node1 11124 1726882390.88681: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008c 11124 1726882390.88684: WORKER PROCESS EXITING 11124 1726882390.89393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882390.91110: done with get_vars() 11124 1726882390.91138: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:33:10 -0400 (0:00:00.055) 0:00:31.154 ****** 11124 1726882390.91232: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11124 1726882390.92047: worker is 1 (out of 1 available) 11124 1726882390.92064: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11124 1726882390.92079: done queuing things up, now waiting for results queue to drain 11124 1726882390.92081: waiting for pending results... 
11124 1726882390.92948: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11124 1726882390.93343: in run() - task 0e448fcc-3ce9-8362-0f62-00000000008d 11124 1726882390.93368: variable 'ansible_search_path' from source: unknown 11124 1726882390.93405: variable 'ansible_search_path' from source: unknown 11124 1726882390.93452: calling self._execute() 11124 1726882390.93730: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882390.94377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882390.94421: variable 'omit' from source: magic vars 11124 1726882390.95120: variable 'ansible_distribution_major_version' from source: facts 11124 1726882390.95277: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882390.95289: variable 'omit' from source: magic vars 11124 1726882390.95370: variable 'omit' from source: magic vars 11124 1726882390.95657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11124 1726882391.01096: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11124 1726882391.01421: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11124 1726882391.01504: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11124 1726882391.01556: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11124 1726882391.01615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11124 1726882391.01711: variable 'network_provider' from source: set_fact 11124 1726882391.01878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11124 1726882391.01916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11124 1726882391.01947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11124 1726882391.02004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11124 1726882391.02025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11124 1726882391.02114: variable 'omit' from source: magic vars 11124 1726882391.02245: variable 'omit' from source: magic vars 11124 1726882391.02365: variable 'network_connections' from source: task vars 11124 1726882391.02381: variable 'port2_profile' from source: play vars 11124 1726882391.02454: variable 'port2_profile' from source: play vars 11124 1726882391.02471: variable 'port1_profile' from source: play vars 11124 1726882391.02536: variable 'port1_profile' from source: play vars 11124 1726882391.02553: variable 'controller_profile' from source: play vars 11124 1726882391.02615: variable 'controller_profile' from source: play vars 11124 1726882391.02797: variable 'omit' from source: magic vars 11124 1726882391.02811: variable '__lsr_ansible_managed' from source: task vars 11124 1726882391.02881: variable '__lsr_ansible_managed' from source: task vars 11124 1726882391.03083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11124 
1726882391.03502: Loaded config def from plugin (lookup/template) 11124 1726882391.03529: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11124 1726882391.03653: File lookup term: get_ansible_managed.j2 11124 1726882391.03673: variable 'ansible_search_path' from source: unknown 11124 1726882391.03689: evaluation_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11124 1726882391.03706: search_path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11124 1726882391.03839: variable 'ansible_search_path' from source: unknown 11124 1726882391.13888: variable 'ansible_managed' from source: unknown 11124 1726882391.14031: variable 'omit' from source: magic vars 11124 1726882391.14066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882391.14096: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882391.14117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882391.14136: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882391.14149: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882391.14183: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882391.14191: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882391.14197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882391.14294: Set connection var ansible_shell_executable to /bin/sh 11124 1726882391.14307: Set connection var ansible_shell_type to sh 11124 1726882391.14318: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882391.14327: Set connection var ansible_timeout to 10 11124 1726882391.14335: Set connection var ansible_pipelining to False 11124 1726882391.14340: Set connection var ansible_connection to ssh 11124 1726882391.14366: variable 'ansible_shell_executable' from source: unknown 11124 1726882391.14374: variable 'ansible_connection' from source: unknown 11124 1726882391.14381: variable 'ansible_module_compression' from source: unknown 11124 1726882391.14388: variable 'ansible_shell_type' from source: unknown 11124 1726882391.14394: variable 'ansible_shell_executable' from source: unknown 11124 1726882391.14399: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882391.14405: variable 'ansible_pipelining' from source: unknown 11124 1726882391.14411: variable 'ansible_timeout' from source: unknown 11124 1726882391.14426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 
1726882391.14551: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882391.14568: variable 'omit' from source: magic vars 11124 1726882391.14578: starting attempt loop 11124 1726882391.14584: running the handler 11124 1726882391.14601: _low_level_execute_command(): starting 11124 1726882391.14611: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882391.15312: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882391.15326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.15339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.15356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.15401: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.15413: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882391.15425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.15441: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882391.15451: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882391.15462: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882391.15480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.15493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.15506: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.15517: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.15526: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882391.15538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.15615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882391.15637: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882391.15651: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882391.15785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882391.17480: stdout chunk (state=3): >>>/root <<< 11124 1726882391.17683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882391.17687: stdout chunk (state=3): >>><<< 11124 1726882391.17689: stderr chunk (state=3): >>><<< 11124 1726882391.17807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882391.17812: _low_level_execute_command(): starting 11124 1726882391.17815: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082 `" && echo ansible-tmp-1726882391.1770983-12549-104376920552082="` echo /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082 `" ) && sleep 0' 11124 1726882391.18453: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882391.18465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.18476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.18490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.18528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.18535: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882391.18544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.18560: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882391.18570: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882391.18578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882391.18585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.18594: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 11124 1726882391.18605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.18612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.18618: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882391.18627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.18704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882391.18722: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882391.18734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882391.18860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882391.20767: stdout chunk (state=3): >>>ansible-tmp-1726882391.1770983-12549-104376920552082=/root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082 <<< 11124 1726882391.20947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882391.20950: stdout chunk (state=3): >>><<< 11124 1726882391.20961: stderr chunk (state=3): >>><<< 11124 1726882391.20981: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882391.1770983-12549-104376920552082=/root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882391.21032: variable 'ansible_module_compression' from source: unknown 11124 1726882391.21087: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11124 1726882391.21119: variable 'ansible_facts' from source: unknown 11124 1726882391.21212: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/AnsiballZ_network_connections.py 11124 1726882391.21428: Sending initial data 11124 1726882391.21432: Sent initial data (168 bytes) 11124 1726882391.22820: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882391.22838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.22852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.22871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.22912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.22924: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882391.22942: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.22960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882391.22975: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882391.22987: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882391.23002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.23018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.23037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.23057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.23074: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882391.23091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.23173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882391.23196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882391.23212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882391.23334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882391.25108: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882391.25201: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882391.25289: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp9knq1x7j /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/AnsiballZ_network_connections.py <<< 11124 1726882391.25397: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882391.27549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882391.27747: stderr chunk (state=3): >>><<< 11124 1726882391.27750: stdout chunk (state=3): >>><<< 11124 1726882391.27753: done transferring module to remote 11124 1726882391.27755: _low_level_execute_command(): starting 11124 1726882391.27757: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/ /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/AnsiballZ_network_connections.py && sleep 0' 11124 1726882391.28344: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882391.28359: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.28376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.28393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.28436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.28450: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882391.28467: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.28485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882391.28495: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882391.28505: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882391.28515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.28526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.28541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.28552: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.28562: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882391.28577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.28653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882391.28673: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882391.28687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882391.28813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882391.30670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882391.30675: stdout chunk (state=3): >>><<< 11124 1726882391.30683: stderr chunk (state=3): >>><<< 11124 1726882391.30703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882391.30710: _low_level_execute_command(): starting 11124 1726882391.30714: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/AnsiballZ_network_connections.py && sleep 0' 11124 1726882391.31385: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882391.31389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.31396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.31411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.31450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.31461: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882391.31477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.31490: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 11124 1726882391.31498: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882391.31504: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882391.31511: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.31520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.31531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.31538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882391.31545: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882391.31557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.31628: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882391.31643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882391.31646: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882391.31793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882391.83668: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11124 1726882391.83673: stdout chunk (state=3): >>> File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5930b051-7181-4874-ab99-c3feeaedcfbf: error=unknown <<< 11124 1726882391.85538: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11124 1726882391.85542: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/8ae45e77-4d55-4f5d-855c-022ad1860e4a: error=unknown <<< 11124 1726882391.87530: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, 
in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a7e63e00-15bf-4393-a589-86c57260d9e4: error=unknown <<< 11124 1726882391.87720: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11124 1726882391.89304: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
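The record above shows the module's stdout carrying three `LsrNetworkNmError` tracebacks before the trailing JSON result, yet the task still completes with `rc=0` and `"changed": true` — ansible-core recovers the result by skipping non-JSON lines. A minimal sketch of that recovery (the helper name is hypothetical and much simpler than ansible-core's internal JSON-line filtering; assumes the result object sits on its own line starting with `{`):

```python
import json

def extract_module_result(stdout: str) -> dict:
    """Scan mixed module stdout for the trailing JSON result object.

    Traceback lines printed before the result (as in the log above)
    do not start with '{' and are skipped; the first line that both
    starts with '{' and parses as JSON is returned.
    """
    for line in stdout.splitlines():
        stripped = line.strip()
        if stripped.startswith("{"):
            try:
                return json.loads(stripped)
            except json.JSONDecodeError:
                continue
    raise ValueError("no JSON result found in module stdout")

# Abbreviated stand-in for the stdout captured in the log above
mixed = (
    "Traceback (most recent call last):\n"
    '  File "connection.py", line 113, in _nm_profile_volatile_update2_call_back\n'
    "LsrNetworkNmError: Connection volatilize aborted on bond0: error=unknown\n"
    '{"changed": true, "warnings": [], "stderr": "\\n"}\n'
)
result = extract_module_result(mixed)
```

This is why the "error=unknown" volatilize tracebacks do not fail the play: they never reach the result parser as data, only as noise ahead of the JSON payload.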
<<< 11124 1726882391.89372: stderr chunk (state=3): >>><<< 11124 1726882391.89376: stdout chunk (state=3): >>><<< 11124 1726882391.89397: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/5930b051-7181-4874-ab99-c3feeaedcfbf: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/8ae45e77-4d55-4f5d-855c-022ad1860e4a: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_90_e0b3i/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a7e63e00-15bf-4393-a589-86c57260d9e4: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882391.89434: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882391.89441: _low_level_execute_command(): starting 11124 
1726882391.89446: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882391.1770983-12549-104376920552082/ > /dev/null 2>&1 && sleep 0' 11124 1726882391.89918: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882391.89922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882391.89955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.89958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882391.89966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882391.90019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882391.90022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882391.90024: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882391.90122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882391.91973: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882391.92044: stderr chunk (state=3): >>><<< 11124 1726882391.92046: stdout chunk (state=3): >>><<< 11124 1726882391.92069: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882391.92072: handler run complete 11124 1726882391.92094: attempt loop complete, returning result 11124 1726882391.92097: _execute() done 11124 1726882391.92099: dumping result to json 11124 1726882391.92104: done dumping result, returning 11124 1726882391.92113: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0e448fcc-3ce9-8362-0f62-00000000008d] 11124 1726882391.92117: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008d 11124 1726882391.92229: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008d 11124 1726882391.92232: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# 
system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11124 1726882391.92346: no more pending results, returning what we have 11124 1726882391.92349: results queue empty 11124 1726882391.92350: checking for any_errors_fatal 11124 1726882391.92355: done checking for any_errors_fatal 11124 1726882391.92356: checking for max_fail_percentage 11124 1726882391.92357: done checking for max_fail_percentage 11124 1726882391.92358: checking to see if all hosts have failed and the running result is not ok 11124 1726882391.92359: done checking to see if all hosts have failed 11124 1726882391.92360: getting the remaining hosts for this loop 11124 1726882391.92361: done getting the remaining hosts for this loop 11124 1726882391.92368: getting the next task for host managed_node1 11124 1726882391.92374: done getting next task for host managed_node1 11124 1726882391.92377: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11124 1726882391.92381: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882391.92391: getting variables 11124 1726882391.92392: in VariableManager get_vars() 11124 1726882391.92429: Calling all_inventory to load vars for managed_node1 11124 1726882391.92432: Calling groups_inventory to load vars for managed_node1 11124 1726882391.92433: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882391.92443: Calling all_plugins_play to load vars for managed_node1 11124 1726882391.92451: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882391.92454: Calling groups_plugins_play to load vars for managed_node1 11124 1726882391.93398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882391.94567: done with get_vars() 11124 1726882391.94597: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:33:11 -0400 (0:00:01.034) 0:00:32.189 ****** 11124 1726882391.94691: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11124 1726882391.95035: worker is 1 (out of 1 available) 11124 1726882391.95048: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11124 1726882391.95060: done queuing things up, now waiting for results queue to drain 11124 1726882391.95061: waiting for pending results... 
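The completed "Configure networking connection profiles" task above sent a teardown payload with the VLAN profiles listed before their parent bond. A sketch reconstructing that payload from the `module_args` in the log (the ordering-check helper is illustrative, not part of the role):

```python
# Teardown payload sent to fedora.linux_system_roles.network_connections,
# as shown in the module_args above: each profile is taken down
# (state: down) and removed from persistent config (persistent_state: absent),
# child VLAN profiles (bond0.0/bond0.1) before the bond itself.
connections = [
    {"name": "bond0.1", "persistent_state": "absent", "state": "down"},
    {"name": "bond0.0", "persistent_state": "absent", "state": "down"},
    {"name": "bond0", "persistent_state": "absent", "state": "down"},
]

def parent_last(conns):
    """Check the parent device is torn down after its child profiles."""
    names = [c["name"] for c in conns]
    return names.index("bond0") == len(names) - 1
```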
11124 1726882391.95355: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11124 1726882391.95514: in run() - task 0e448fcc-3ce9-8362-0f62-00000000008e 11124 1726882391.95535: variable 'ansible_search_path' from source: unknown 11124 1726882391.95543: variable 'ansible_search_path' from source: unknown 11124 1726882391.95586: calling self._execute() 11124 1726882391.95699: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882391.95712: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882391.95730: variable 'omit' from source: magic vars 11124 1726882391.96129: variable 'ansible_distribution_major_version' from source: facts 11124 1726882391.96147: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882391.96278: variable 'network_state' from source: role '' defaults 11124 1726882391.96294: Evaluated conditional (network_state != {}): False 11124 1726882391.96307: when evaluation is False, skipping this task 11124 1726882391.96310: _execute() done 11124 1726882391.96313: dumping result to json 11124 1726882391.96315: done dumping result, returning 11124 1726882391.96322: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0e448fcc-3ce9-8362-0f62-00000000008e] 11124 1726882391.96327: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008e 11124 1726882391.96433: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008e 11124 1726882391.96437: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11124 1726882391.96492: no more pending results, returning what we have 11124 1726882391.96495: results queue empty 11124 1726882391.96496: checking for any_errors_fatal 11124 1726882391.96507: done checking for any_errors_fatal 
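The skip above follows from `network_state` keeping its role default of `{}`: the conditional `network_state != {}` evaluates False, so the task body never runs. A toy Python stand-in for that check (Ansible evaluates it through Jinja2 templating, not plain Python, but the comparison semantics match here):

```python
# Mirrors the log record:
#   Evaluated conditional (network_state != {}): False -> task skipped
def should_configure_network_state(network_state: dict) -> bool:
    """Run the nmstate-based task only when a desired state was provided."""
    return network_state != {}

role_default = {}  # 'network_state' from role defaults, as in the log
```

Supplying any non-empty `network_state` (for example one with an `interfaces` key) flips the conditional to True and the task would execute instead of skipping.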
11124 1726882391.96508: checking for max_fail_percentage 11124 1726882391.96510: done checking for max_fail_percentage 11124 1726882391.96511: checking to see if all hosts have failed and the running result is not ok 11124 1726882391.96512: done checking to see if all hosts have failed 11124 1726882391.96512: getting the remaining hosts for this loop 11124 1726882391.96514: done getting the remaining hosts for this loop 11124 1726882391.96517: getting the next task for host managed_node1 11124 1726882391.96524: done getting next task for host managed_node1 11124 1726882391.96528: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11124 1726882391.96533: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882391.96553: getting variables 11124 1726882391.96555: in VariableManager get_vars() 11124 1726882391.96617: Calling all_inventory to load vars for managed_node1 11124 1726882391.96620: Calling groups_inventory to load vars for managed_node1 11124 1726882391.96622: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882391.96633: Calling all_plugins_play to load vars for managed_node1 11124 1726882391.96635: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882391.96638: Calling groups_plugins_play to load vars for managed_node1 11124 1726882391.97482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882391.98675: done with get_vars() 11124 1726882391.98698: done getting variables 11124 1726882391.98765: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:33:11 -0400 (0:00:00.041) 0:00:32.230 ****** 11124 1726882391.98804: entering _queue_task() for managed_node1/debug 11124 1726882391.99158: worker is 1 (out of 1 available) 11124 1726882391.99172: exiting _queue_task() for managed_node1/debug 11124 1726882391.99183: done queuing things up, now waiting for results queue to drain 11124 1726882391.99185: waiting for pending results... 
11124 1726882391.99481: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11124 1726882391.99593: in run() - task 0e448fcc-3ce9-8362-0f62-00000000008f 11124 1726882391.99605: variable 'ansible_search_path' from source: unknown 11124 1726882391.99609: variable 'ansible_search_path' from source: unknown 11124 1726882391.99657: calling self._execute() 11124 1726882391.99734: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882391.99743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882391.99762: variable 'omit' from source: magic vars 11124 1726882392.00127: variable 'ansible_distribution_major_version' from source: facts 11124 1726882392.00143: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882392.00155: variable 'omit' from source: magic vars 11124 1726882392.00230: variable 'omit' from source: magic vars 11124 1726882392.00273: variable 'omit' from source: magic vars 11124 1726882392.00321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882392.00360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882392.00389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882392.00409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.00424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.00456: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882392.00468: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.00477: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11124 1726882392.00575: Set connection var ansible_shell_executable to /bin/sh 11124 1726882392.00588: Set connection var ansible_shell_type to sh 11124 1726882392.00599: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882392.00607: Set connection var ansible_timeout to 10 11124 1726882392.00616: Set connection var ansible_pipelining to False 11124 1726882392.00621: Set connection var ansible_connection to ssh 11124 1726882392.00646: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.00654: variable 'ansible_connection' from source: unknown 11124 1726882392.00662: variable 'ansible_module_compression' from source: unknown 11124 1726882392.00672: variable 'ansible_shell_type' from source: unknown 11124 1726882392.00678: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.00684: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.00691: variable 'ansible_pipelining' from source: unknown 11124 1726882392.00697: variable 'ansible_timeout' from source: unknown 11124 1726882392.00704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.00853: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882392.00874: variable 'omit' from source: magic vars 11124 1726882392.00884: starting attempt loop 11124 1726882392.00891: running the handler 11124 1726882392.01028: variable '__network_connections_result' from source: set_fact 11124 1726882392.01087: handler run complete 11124 1726882392.01109: attempt loop complete, returning result 11124 1726882392.01116: _execute() done 11124 1726882392.01123: dumping result to json 11124 1726882392.01129: 
done dumping result, returning 11124 1726882392.01143: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0e448fcc-3ce9-8362-0f62-00000000008f] 11124 1726882392.01165: sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008f 11124 1726882392.01283: done sending task result for task 0e448fcc-3ce9-8362-0f62-00000000008f 11124 1726882392.01289: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 11124 1726882392.01373: no more pending results, returning what we have 11124 1726882392.01376: results queue empty 11124 1726882392.01377: checking for any_errors_fatal 11124 1726882392.01383: done checking for any_errors_fatal 11124 1726882392.01384: checking for max_fail_percentage 11124 1726882392.01385: done checking for max_fail_percentage 11124 1726882392.01386: checking to see if all hosts have failed and the running result is not ok 11124 1726882392.01387: done checking to see if all hosts have failed 11124 1726882392.01388: getting the remaining hosts for this loop 11124 1726882392.01390: done getting the remaining hosts for this loop 11124 1726882392.01393: getting the next task for host managed_node1 11124 1726882392.01399: done getting next task for host managed_node1 11124 1726882392.01405: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11124 1726882392.01409: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882392.01420: getting variables 11124 1726882392.01422: in VariableManager get_vars() 11124 1726882392.01462: Calling all_inventory to load vars for managed_node1 11124 1726882392.01466: Calling groups_inventory to load vars for managed_node1 11124 1726882392.01468: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882392.01480: Calling all_plugins_play to load vars for managed_node1 11124 1726882392.01484: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882392.01487: Calling groups_plugins_play to load vars for managed_node1 11124 1726882392.02319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882392.03430: done with get_vars() 11124 1726882392.03461: done getting variables 11124 1726882392.03525: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:33:12 -0400 (0:00:00.047) 0:00:32.278 ****** 11124 1726882392.03571: entering _queue_task() for managed_node1/debug 11124 1726882392.03908: worker is 1 (out of 1 available) 11124 
1726882392.03921: exiting _queue_task() for managed_node1/debug 11124 1726882392.03934: done queuing things up, now waiting for results queue to drain 11124 1726882392.03935: waiting for pending results... 11124 1726882392.04230: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11124 1726882392.04390: in run() - task 0e448fcc-3ce9-8362-0f62-000000000090 11124 1726882392.04402: variable 'ansible_search_path' from source: unknown 11124 1726882392.04405: variable 'ansible_search_path' from source: unknown 11124 1726882392.04436: calling self._execute() 11124 1726882392.04530: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.04533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.04542: variable 'omit' from source: magic vars 11124 1726882392.04830: variable 'ansible_distribution_major_version' from source: facts 11124 1726882392.04840: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882392.04846: variable 'omit' from source: magic vars 11124 1726882392.04895: variable 'omit' from source: magic vars 11124 1726882392.04923: variable 'omit' from source: magic vars 11124 1726882392.04958: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882392.04985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882392.05000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882392.05013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.05023: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.05048: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882392.05051: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.05057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.05124: Set connection var ansible_shell_executable to /bin/sh 11124 1726882392.05131: Set connection var ansible_shell_type to sh 11124 1726882392.05138: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882392.05148: Set connection var ansible_timeout to 10 11124 1726882392.05154: Set connection var ansible_pipelining to False 11124 1726882392.05160: Set connection var ansible_connection to ssh 11124 1726882392.05177: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.05180: variable 'ansible_connection' from source: unknown 11124 1726882392.05183: variable 'ansible_module_compression' from source: unknown 11124 1726882392.05185: variable 'ansible_shell_type' from source: unknown 11124 1726882392.05188: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.05190: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.05193: variable 'ansible_pipelining' from source: unknown 11124 1726882392.05196: variable 'ansible_timeout' from source: unknown 11124 1726882392.05200: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.05305: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882392.05314: variable 'omit' from source: magic vars 11124 1726882392.05319: starting attempt loop 11124 1726882392.05322: running the handler 11124 1726882392.05364: variable '__network_connections_result' from source: set_fact 11124 
1726882392.05420: variable '__network_connections_result' from source: set_fact 11124 1726882392.05511: handler run complete 11124 1726882392.05529: attempt loop complete, returning result 11124 1726882392.05533: _execute() done 11124 1726882392.05535: dumping result to json 11124 1726882392.05538: done dumping result, returning 11124 1726882392.05546: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0e448fcc-3ce9-8362-0f62-000000000090] 11124 1726882392.05553: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000090 11124 1726882392.05643: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000090 11124 1726882392.05645: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11124 1726882392.05737: no more pending results, returning what we have 11124 1726882392.05740: results queue empty 11124 1726882392.05741: checking for any_errors_fatal 11124 1726882392.05749: done checking for any_errors_fatal 11124 1726882392.05750: checking for max_fail_percentage 11124 1726882392.05753: done checking for max_fail_percentage 11124 1726882392.05756: checking to see if all hosts have failed and the running result is not ok 11124 1726882392.05757: done checking to see if all hosts have failed 11124 1726882392.05757: getting the remaining hosts for this loop 11124 1726882392.05759: done getting the remaining hosts for this loop 11124 1726882392.05763: 
getting the next task for host managed_node1 11124 1726882392.05770: done getting next task for host managed_node1 11124 1726882392.05773: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11124 1726882392.05777: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882392.05787: getting variables 11124 1726882392.05788: in VariableManager get_vars() 11124 1726882392.05823: Calling all_inventory to load vars for managed_node1 11124 1726882392.05825: Calling groups_inventory to load vars for managed_node1 11124 1726882392.05827: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882392.05835: Calling all_plugins_play to load vars for managed_node1 11124 1726882392.05843: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882392.05846: Calling groups_plugins_play to load vars for managed_node1 11124 1726882392.06786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882392.07732: done with get_vars() 11124 1726882392.07755: done getting variables 11124 1726882392.07801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:33:12 -0400 (0:00:00.042) 0:00:32.320 ****** 11124 1726882392.07829: entering _queue_task() for managed_node1/debug 11124 1726882392.08085: worker is 1 (out of 1 available) 11124 1726882392.08098: exiting _queue_task() for managed_node1/debug 11124 1726882392.08111: done queuing things up, now waiting for results queue to drain 11124 1726882392.08112: waiting for pending results... 
11124 1726882392.08298: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11124 1726882392.08404: in run() - task 0e448fcc-3ce9-8362-0f62-000000000091 11124 1726882392.08416: variable 'ansible_search_path' from source: unknown 11124 1726882392.08419: variable 'ansible_search_path' from source: unknown 11124 1726882392.08454: calling self._execute() 11124 1726882392.08530: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.08535: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.08542: variable 'omit' from source: magic vars 11124 1726882392.08830: variable 'ansible_distribution_major_version' from source: facts 11124 1726882392.08840: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882392.08926: variable 'network_state' from source: role '' defaults 11124 1726882392.08935: Evaluated conditional (network_state != {}): False 11124 1726882392.08939: when evaluation is False, skipping this task 11124 1726882392.08941: _execute() done 11124 1726882392.08944: dumping result to json 11124 1726882392.08946: done dumping result, returning 11124 1726882392.08956: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0e448fcc-3ce9-8362-0f62-000000000091] 11124 1726882392.08959: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000091 11124 1726882392.09055: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000091 11124 1726882392.09058: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11124 1726882392.09114: no more pending results, returning what we have 11124 1726882392.09117: results queue empty 11124 1726882392.09118: checking for any_errors_fatal 11124 1726882392.09127: done checking for any_errors_fatal 11124 1726882392.09127: checking for 
max_fail_percentage 11124 1726882392.09129: done checking for max_fail_percentage 11124 1726882392.09130: checking to see if all hosts have failed and the running result is not ok 11124 1726882392.09131: done checking to see if all hosts have failed 11124 1726882392.09132: getting the remaining hosts for this loop 11124 1726882392.09133: done getting the remaining hosts for this loop 11124 1726882392.09137: getting the next task for host managed_node1 11124 1726882392.09143: done getting next task for host managed_node1 11124 1726882392.09148: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11124 1726882392.09155: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882392.09181: getting variables 11124 1726882392.09183: in VariableManager get_vars() 11124 1726882392.09219: Calling all_inventory to load vars for managed_node1 11124 1726882392.09222: Calling groups_inventory to load vars for managed_node1 11124 1726882392.09224: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882392.09233: Calling all_plugins_play to load vars for managed_node1 11124 1726882392.09236: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882392.09238: Calling groups_plugins_play to load vars for managed_node1 11124 1726882392.10065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882392.11012: done with get_vars() 11124 1726882392.11036: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:33:12 -0400 (0:00:00.032) 0:00:32.353 ****** 11124 1726882392.11114: entering _queue_task() for managed_node1/ping 11124 1726882392.11361: worker is 1 (out of 1 available) 11124 1726882392.11375: exiting _queue_task() for managed_node1/ping 11124 1726882392.11388: done queuing things up, now waiting for results queue to drain 11124 1726882392.11390: waiting for pending results... 
11124 1726882392.11585: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11124 1726882392.11688: in run() - task 0e448fcc-3ce9-8362-0f62-000000000092 11124 1726882392.11699: variable 'ansible_search_path' from source: unknown 11124 1726882392.11703: variable 'ansible_search_path' from source: unknown 11124 1726882392.11733: calling self._execute() 11124 1726882392.11815: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.11820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.11829: variable 'omit' from source: magic vars 11124 1726882392.12111: variable 'ansible_distribution_major_version' from source: facts 11124 1726882392.12121: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882392.12128: variable 'omit' from source: magic vars 11124 1726882392.12174: variable 'omit' from source: magic vars 11124 1726882392.12197: variable 'omit' from source: magic vars 11124 1726882392.12232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882392.12261: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882392.12281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882392.12294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.12303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.12330: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882392.12333: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.12335: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11124 1726882392.12407: Set connection var ansible_shell_executable to /bin/sh 11124 1726882392.12414: Set connection var ansible_shell_type to sh 11124 1726882392.12421: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882392.12428: Set connection var ansible_timeout to 10 11124 1726882392.12437: Set connection var ansible_pipelining to False 11124 1726882392.12439: Set connection var ansible_connection to ssh 11124 1726882392.12457: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.12460: variable 'ansible_connection' from source: unknown 11124 1726882392.12463: variable 'ansible_module_compression' from source: unknown 11124 1726882392.12467: variable 'ansible_shell_type' from source: unknown 11124 1726882392.12470: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.12472: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.12476: variable 'ansible_pipelining' from source: unknown 11124 1726882392.12478: variable 'ansible_timeout' from source: unknown 11124 1726882392.12482: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.12641: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11124 1726882392.12649: variable 'omit' from source: magic vars 11124 1726882392.12659: starting attempt loop 11124 1726882392.12663: running the handler 11124 1726882392.12676: _low_level_execute_command(): starting 11124 1726882392.12682: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882392.13223: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 
1726882392.13239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.13259: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.13275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.13320: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.13332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.13446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.15129: stdout chunk (state=3): >>>/root <<< 11124 1726882392.15284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.15344: stderr chunk (state=3): >>><<< 11124 1726882392.15347: stdout chunk (state=3): >>><<< 11124 1726882392.15471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.15475: _low_level_execute_command(): starting 11124 1726882392.15480: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426 `" && echo ansible-tmp-1726882392.1537366-12597-268875263459426="` echo /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426 `" ) && sleep 0' 11124 1726882392.15962: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.15983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.15986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.16026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.16037: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.16039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.16041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882392.16044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.16084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.16100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.16111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.16224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.18095: stdout chunk (state=3): >>>ansible-tmp-1726882392.1537366-12597-268875263459426=/root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426 <<< 11124 1726882392.18195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.18250: stderr chunk (state=3): >>><<< 11124 1726882392.18254: stdout chunk (state=3): >>><<< 11124 1726882392.18277: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882392.1537366-12597-268875263459426=/root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.18315: variable 'ansible_module_compression' from source: unknown 11124 1726882392.18350: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11124 1726882392.18385: variable 'ansible_facts' from source: unknown 11124 1726882392.18438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/AnsiballZ_ping.py 11124 1726882392.18549: Sending initial data 11124 1726882392.18552: Sent initial data (153 bytes) 11124 1726882392.19230: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.19238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.19289: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882392.19293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.19296: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.19342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.19351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.19463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.21214: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11124 1726882392.21230: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11124 1726882392.21241: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11124 1726882392.21252: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11124 1726882392.21275: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11124 1726882392.21297: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11124 1726882392.21316: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11124 1726882392.21345: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11124 1726882392.21358: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882392.21479: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: 
Server handle limit 1019; using 64 <<< 11124 1726882392.21577: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpunnvydwm /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/AnsiballZ_ping.py <<< 11124 1726882392.21670: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882392.22818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.22969: stderr chunk (state=3): >>><<< 11124 1726882392.22972: stdout chunk (state=3): >>><<< 11124 1726882392.22975: done transferring module to remote 11124 1726882392.22977: _low_level_execute_command(): starting 11124 1726882392.22979: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/ /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/AnsiballZ_ping.py && sleep 0' 11124 1726882392.23387: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.23390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.23426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.23431: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.23433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.23483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.23486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.23587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.25385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.25410: stderr chunk (state=3): >>><<< 11124 1726882392.25413: stdout chunk (state=3): >>><<< 11124 1726882392.25431: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.25435: _low_level_execute_command(): starting 11124 1726882392.25439: 
_low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/AnsiballZ_ping.py && sleep 0' 11124 1726882392.26091: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.26099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.26109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.26122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.26161: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.26181: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882392.26191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.26203: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882392.26211: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882392.26217: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882392.26224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.26233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.26244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.26250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.26262: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.26272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.26349: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.26370: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.26381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.26517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.39405: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11124 1726882392.40475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882392.40480: stdout chunk (state=3): >>><<< 11124 1726882392.40484: stderr chunk (state=3): >>><<< 11124 1726882392.40506: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882392.40532: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882392.40541: _low_level_execute_command(): starting 11124 1726882392.40546: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882392.1537366-12597-268875263459426/ > /dev/null 2>&1 && sleep 0' 11124 1726882392.41226: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.41235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.41245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.41266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.41316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.41323: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882392.41334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.41347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882392.41357: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882392.41365: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882392.41375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.41384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.41403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.41410: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.41417: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.41426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.41503: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.41529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.41542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.41684: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.43604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.43609: stdout chunk (state=3): >>><<< 11124 1726882392.43613: stderr chunk (state=3): >>><<< 11124 1726882392.43634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.43640: handler run complete 11124 1726882392.43661: attempt loop complete, returning result 11124 1726882392.43667: _execute() done 11124 1726882392.43672: dumping result to json 11124 1726882392.43677: done dumping result, returning 11124 1726882392.43686: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0e448fcc-3ce9-8362-0f62-000000000092] 11124 1726882392.43691: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000092 11124 1726882392.43788: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000092 11124 1726882392.43790: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 11124 1726882392.43932: no more pending results, returning what we have 11124 1726882392.43935: results queue empty 11124 1726882392.43937: checking for any_errors_fatal 11124 1726882392.43943: done checking for any_errors_fatal 11124 1726882392.43944: checking for max_fail_percentage 11124 1726882392.43946: done checking for max_fail_percentage 11124 1726882392.43947: checking to see if all hosts have failed and the running result is not ok 11124 1726882392.43948: done checking to see if all hosts have failed 11124 1726882392.43949: getting the remaining hosts for this loop 11124 1726882392.43953: done 
getting the remaining hosts for this loop 11124 1726882392.43957: getting the next task for host managed_node1 11124 1726882392.43974: done getting next task for host managed_node1 11124 1726882392.43977: ^ task is: TASK: meta (role_complete) 11124 1726882392.43981: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882392.43995: getting variables 11124 1726882392.43998: in VariableManager get_vars() 11124 1726882392.44042: Calling all_inventory to load vars for managed_node1 11124 1726882392.44045: Calling groups_inventory to load vars for managed_node1 11124 1726882392.44048: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882392.44062: Calling all_plugins_play to load vars for managed_node1 11124 1726882392.44068: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882392.44071: Calling groups_plugins_play to load vars for managed_node1 11124 1726882392.45842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882392.47730: done with get_vars() 11124 1726882392.47776: done getting variables 11124 1726882392.47892: done queuing things up, now waiting for results queue to drain 11124 1726882392.47895: results queue empty 11124 1726882392.47895: checking for any_errors_fatal 11124 1726882392.47899: done checking for any_errors_fatal 11124 1726882392.47899: checking for max_fail_percentage 11124 1726882392.47900: done checking for max_fail_percentage 11124 1726882392.47901: checking to see if all hosts have failed and the running result is not ok 11124 1726882392.47902: done checking to see if all hosts have failed 11124 1726882392.47903: getting the remaining hosts for this loop 11124 1726882392.47904: done getting the remaining hosts for this loop 11124 1726882392.47906: getting the next task for host managed_node1 11124 1726882392.47911: done getting next task for host managed_node1 11124 1726882392.47913: ^ task is: TASK: Delete the device '{{ controller_device }}' 11124 1726882392.47915: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882392.47917: getting variables 11124 1726882392.47918: in VariableManager get_vars() 11124 1726882392.47933: Calling all_inventory to load vars for managed_node1 11124 1726882392.47935: Calling groups_inventory to load vars for managed_node1 11124 1726882392.47937: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882392.47946: Calling all_plugins_play to load vars for managed_node1 11124 1726882392.47949: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882392.47954: Calling groups_plugins_play to load vars for managed_node1 11124 1726882392.49485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882392.52026: done with get_vars() 11124 1726882392.52047: done getting variables 11124 1726882392.52103: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11124 1726882392.52234: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] ************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Friday 20 September 2024 21:33:12 -0400 (0:00:00.411) 0:00:32.765 ****** 11124 1726882392.52270: entering _queue_task() for managed_node1/command 11124 1726882392.53399: worker is 1 (out of 1 available) 11124 1726882392.53412: exiting 
_queue_task() for managed_node1/command 11124 1726882392.53425: done queuing things up, now waiting for results queue to drain 11124 1726882392.53427: waiting for pending results... 11124 1726882392.53541: running TaskExecutor() for managed_node1/TASK: Delete the device 'deprecated-bond' 11124 1726882392.53850: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000c2 11124 1726882392.53877: variable 'ansible_search_path' from source: unknown 11124 1726882392.53920: calling self._execute() 11124 1726882392.54021: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.54032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.54044: variable 'omit' from source: magic vars 11124 1726882392.54422: variable 'ansible_distribution_major_version' from source: facts 11124 1726882392.54439: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882392.54452: variable 'omit' from source: magic vars 11124 1726882392.54478: variable 'omit' from source: magic vars 11124 1726882392.54583: variable 'controller_device' from source: play vars 11124 1726882392.54607: variable 'omit' from source: magic vars 11124 1726882392.54661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882392.54702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882392.54734: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882392.54759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.54781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882392.54815: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 
1726882392.54824: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.54832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.54937: Set connection var ansible_shell_executable to /bin/sh 11124 1726882392.54949: Set connection var ansible_shell_type to sh 11124 1726882392.54969: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882392.54980: Set connection var ansible_timeout to 10 11124 1726882392.54990: Set connection var ansible_pipelining to False 11124 1726882392.54996: Set connection var ansible_connection to ssh 11124 1726882392.55020: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.55027: variable 'ansible_connection' from source: unknown 11124 1726882392.55033: variable 'ansible_module_compression' from source: unknown 11124 1726882392.55039: variable 'ansible_shell_type' from source: unknown 11124 1726882392.55045: variable 'ansible_shell_executable' from source: unknown 11124 1726882392.55050: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882392.55057: variable 'ansible_pipelining' from source: unknown 11124 1726882392.55068: variable 'ansible_timeout' from source: unknown 11124 1726882392.55077: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882392.55215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882392.55233: variable 'omit' from source: magic vars 11124 1726882392.55242: starting attempt loop 11124 1726882392.55249: running the handler 11124 1726882392.55270: _low_level_execute_command(): starting 11124 1726882392.55287: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 
0' 11124 1726882392.56039: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.56056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.56072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.56089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.56126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.56139: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882392.56154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.56180: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882392.56193: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882392.56205: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882392.56218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.56232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.56248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.56261: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.56278: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.56290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.56362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.56393: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.56409: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.56537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.58180: stdout chunk (state=3): >>>/root <<< 11124 1726882392.58369: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.58373: stdout chunk (state=3): >>><<< 11124 1726882392.58386: stderr chunk (state=3): >>><<< 11124 1726882392.58503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.58508: _low_level_execute_command(): starting 11124 1726882392.58511: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323 `" && echo 
ansible-tmp-1726882392.584073-12620-253870728951323="` echo /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323 `" ) && sleep 0' 11124 1726882392.60132: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.60136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.60170: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882392.60173: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882392.60187: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.60190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.60369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.60382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.60444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.60549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.62421: stdout chunk (state=3): >>>ansible-tmp-1726882392.584073-12620-253870728951323=/root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323 <<< 11124 1726882392.62639: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.62643: stdout chunk (state=3): >>><<< 11124 1726882392.62650: stderr chunk (state=3): >>><<< 11124 1726882392.62675: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882392.584073-12620-253870728951323=/root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.62708: variable 'ansible_module_compression' from source: unknown 11124 1726882392.62769: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882392.62808: variable 'ansible_facts' from source: unknown 11124 1726882392.62891: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/AnsiballZ_command.py 11124 1726882392.63082: Sending 
initial data 11124 1726882392.63085: Sent initial data (155 bytes) 11124 1726882392.65388: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.65395: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.65406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.65420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.65470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.65477: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882392.65487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.65500: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882392.65507: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882392.65513: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882392.65521: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.65530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.65540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.65548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.65569: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.65577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.65645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.65661: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.65677: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.65798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.67522: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882392.67607: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882392.67702: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmprbuwzwoq /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/AnsiballZ_command.py <<< 11124 1726882392.67790: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882392.69452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.69575: stderr chunk (state=3): >>><<< 11124 1726882392.69579: stdout chunk (state=3): >>><<< 11124 1726882392.69582: done transferring module to remote 11124 1726882392.69585: _low_level_execute_command(): starting 11124 1726882392.69587: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/ /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/AnsiballZ_command.py && sleep 0' 11124 
1726882392.71317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.71458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.71477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.71491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.71533: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.71539: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882392.71550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.71571: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882392.71578: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882392.71585: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882392.71592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.71602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.71681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.71688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.71695: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.71704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.71897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.71918: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.71928: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.72053: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.73881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.73897: stderr chunk (state=3): >>><<< 11124 1726882392.73900: stdout chunk (state=3): >>><<< 11124 1726882392.73919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.73922: _low_level_execute_command(): starting 11124 1726882392.73928: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/AnsiballZ_command.py && sleep 0' 11124 1726882392.75099: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11124 1726882392.75103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.76008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882392.76012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882392.76025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.76030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.76042: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.76047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.76121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.76135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.76140: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.76273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.90137: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 21:33:12.892696", "end": "2024-09-20 21:33:12.899849", "delta": "0:00:00.007153", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": 
{"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882392.91230: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. <<< 11124 1726882392.91235: stderr chunk (state=3): >>><<< 11124 1726882392.91237: stdout chunk (state=3): >>><<< 11124 1726882392.91262: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 21:33:12.892696", "end": "2024-09-20 21:33:12.899849", "delta": "0:00:00.007153", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.44.90 closed. 11124 1726882392.91307: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882392.91316: _low_level_execute_command(): starting 11124 1726882392.91318: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882392.584073-12620-253870728951323/ > /dev/null 2>&1 && sleep 0' 11124 1726882392.92552: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882392.92557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.92570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.92583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.92625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.92632: 
stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882392.92642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.92655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882392.92665: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882392.92672: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882392.92680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882392.92689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882392.92701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882392.92708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882392.92716: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882392.92728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882392.92801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882392.92820: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882392.92836: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882392.92961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882392.94785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882392.94864: stderr chunk (state=3): >>><<< 11124 1726882392.94868: stdout chunk (state=3): >>><<< 11124 1726882392.94887: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882392.94893: handler run complete 11124 1726882392.94918: Evaluated conditional (False): False 11124 1726882392.94921: Evaluated conditional (False): False 11124 1726882392.94934: attempt loop complete, returning result 11124 1726882392.94937: _execute() done 11124 1726882392.94939: dumping result to json 11124 1726882392.94945: done dumping result, returning 11124 1726882392.94955: done running TaskExecutor() for managed_node1/TASK: Delete the device 'deprecated-bond' [0e448fcc-3ce9-8362-0f62-0000000000c2] 11124 1726882392.94959: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c2 11124 1726882392.95083: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c2 11124 1726882392.95087: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "deprecated-bond" ], "delta": "0:00:00.007153", "end": "2024-09-20 21:33:12.899849", "failed_when_result": false, "rc": 1, "start": "2024-09-20 
21:33:12.892696" } STDERR: Cannot find device "deprecated-bond" MSG: non-zero return code 11124 1726882392.95149: no more pending results, returning what we have 11124 1726882392.95155: results queue empty 11124 1726882392.95156: checking for any_errors_fatal 11124 1726882392.95158: done checking for any_errors_fatal 11124 1726882392.95158: checking for max_fail_percentage 11124 1726882392.95160: done checking for max_fail_percentage 11124 1726882392.95161: checking to see if all hosts have failed and the running result is not ok 11124 1726882392.95162: done checking to see if all hosts have failed 11124 1726882392.95163: getting the remaining hosts for this loop 11124 1726882392.95166: done getting the remaining hosts for this loop 11124 1726882392.95170: getting the next task for host managed_node1 11124 1726882392.95178: done getting next task for host managed_node1 11124 1726882392.95181: ^ task is: TASK: Remove test interfaces 11124 1726882392.95185: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882392.95190: getting variables 11124 1726882392.95192: in VariableManager get_vars() 11124 1726882392.95232: Calling all_inventory to load vars for managed_node1 11124 1726882392.95235: Calling groups_inventory to load vars for managed_node1 11124 1726882392.95237: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882392.95248: Calling all_plugins_play to load vars for managed_node1 11124 1726882392.95253: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882392.95255: Calling groups_plugins_play to load vars for managed_node1 11124 1726882392.97230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882393.00001: done with get_vars() 11124 1726882393.00034: done getting variables 11124 1726882393.00121: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:33:13 -0400 (0:00:00.478) 0:00:33.244 ****** 11124 1726882393.00160: entering _queue_task() for managed_node1/shell 11124 1726882393.00540: worker is 1 (out of 1 available) 11124 1726882393.00555: exiting _queue_task() for managed_node1/shell 11124 1726882393.00568: done queuing things up, now waiting for results queue to drain 11124 1726882393.00569: waiting for pending results... 
11124 1726882393.01478: running TaskExecutor() for managed_node1/TASK: Remove test interfaces 11124 1726882393.01623: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000c6 11124 1726882393.01642: variable 'ansible_search_path' from source: unknown 11124 1726882393.01650: variable 'ansible_search_path' from source: unknown 11124 1726882393.01691: calling self._execute() 11124 1726882393.01791: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.01802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.01821: variable 'omit' from source: magic vars 11124 1726882393.02186: variable 'ansible_distribution_major_version' from source: facts 11124 1726882393.02201: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882393.02214: variable 'omit' from source: magic vars 11124 1726882393.02278: variable 'omit' from source: magic vars 11124 1726882393.02446: variable 'dhcp_interface1' from source: play vars 11124 1726882393.02457: variable 'dhcp_interface2' from source: play vars 11124 1726882393.02483: variable 'omit' from source: magic vars 11124 1726882393.02529: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882393.02572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882393.02597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882393.02619: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882393.02635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882393.02678: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882393.02690: variable 'ansible_host' from source: host 
vars for 'managed_node1' 11124 1726882393.02699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.02808: Set connection var ansible_shell_executable to /bin/sh 11124 1726882393.02820: Set connection var ansible_shell_type to sh 11124 1726882393.02831: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882393.02840: Set connection var ansible_timeout to 10 11124 1726882393.02848: Set connection var ansible_pipelining to False 11124 1726882393.02855: Set connection var ansible_connection to ssh 11124 1726882393.02884: variable 'ansible_shell_executable' from source: unknown 11124 1726882393.02892: variable 'ansible_connection' from source: unknown 11124 1726882393.02903: variable 'ansible_module_compression' from source: unknown 11124 1726882393.02910: variable 'ansible_shell_type' from source: unknown 11124 1726882393.02917: variable 'ansible_shell_executable' from source: unknown 11124 1726882393.02923: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.02931: variable 'ansible_pipelining' from source: unknown 11124 1726882393.02938: variable 'ansible_timeout' from source: unknown 11124 1726882393.02946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.03098: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882393.03114: variable 'omit' from source: magic vars 11124 1726882393.03125: starting attempt loop 11124 1726882393.03132: running the handler 11124 1726882393.03146: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882393.03172: _low_level_execute_command(): starting 11124 1726882393.03197: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882393.04146: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882393.04172: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.04204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.04226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.04278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.04290: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882393.04308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.04327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882393.04338: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882393.04349: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882393.04361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.04376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.04389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.04399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.04409: stderr chunk (state=3): >>>debug2: match found <<< 11124 
1726882393.04425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.04502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.04531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.04547: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.04691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.06294: stdout chunk (state=3): >>>/root <<< 11124 1726882393.06400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.06466: stderr chunk (state=3): >>><<< 11124 1726882393.06469: stdout chunk (state=3): >>><<< 11124 1726882393.06485: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 11124 1726882393.06499: _low_level_execute_command(): starting 11124 1726882393.06505: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250 `" && echo ansible-tmp-1726882393.0648377-12641-246301796651250="` echo /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250 `" ) && sleep 0' 11124 1726882393.06963: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.06973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.07017: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.07021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.07023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.07084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.07087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.07190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 
1726882393.09055: stdout chunk (state=3): >>>ansible-tmp-1726882393.0648377-12641-246301796651250=/root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250 <<< 11124 1726882393.09504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.09508: stderr chunk (state=3): >>><<< 11124 1726882393.09510: stdout chunk (state=3): >>><<< 11124 1726882393.09513: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882393.0648377-12641-246301796651250=/root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.09515: variable 'ansible_module_compression' from source: unknown 11124 1726882393.09517: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882393.09519: variable 
'ansible_facts' from source: unknown 11124 1726882393.09520: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/AnsiballZ_command.py 11124 1726882393.09846: Sending initial data 11124 1726882393.09852: Sent initial data (156 bytes) 11124 1726882393.10547: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882393.10553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.10556: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.10558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.10561: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.10564: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882393.10566: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.10673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882393.10677: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882393.10680: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882393.10682: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.10684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.10686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.10688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.10690: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882393.10692: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.10766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.10770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.10772: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.10858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.12575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882393.12660: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882393.12749: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp1o6bk3zr /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/AnsiballZ_command.py <<< 11124 1726882393.12837: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882393.13989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.14169: stderr chunk (state=3): >>><<< 11124 1726882393.14172: stdout chunk (state=3): >>><<< 11124 1726882393.14175: done transferring module to remote 11124 1726882393.14177: _low_level_execute_command(): starting 11124 1726882393.14179: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/ /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/AnsiballZ_command.py && sleep 0' 11124 1726882393.14698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.14703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.14756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.14759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.14761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882393.14766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.14812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.14816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.14918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.16683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.16753: stderr chunk (state=3): >>><<< 11124 
1726882393.16756: stdout chunk (state=3): >>><<< 11124 1726882393.16779: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.16782: _low_level_execute_command(): starting 11124 1726882393.16784: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/AnsiballZ_command.py && sleep 0' 11124 1726882393.17432: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882393.17441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.17453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.17466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.17503: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.17521: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882393.17531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.17545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882393.17555: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882393.17558: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882393.17576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.17579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.17591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.17599: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.17606: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882393.17617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.17698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.17713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.17716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.17855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.36585: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo 
pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:13.308054", "end": "2024-09-20 21:33:13.364462", "delta": "0:00:00.056408", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882393.37885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
<<< 11124 1726882393.38175: stderr chunk (state=3): >>><<< 11124 1726882393.38179: stdout chunk (state=3): >>><<< 11124 1726882393.38182: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:33:13.308054", "end": "2024-09-20 21:33:13.364462", "delta": "0:00:00.056408", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882393.38190: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 11124 1726882393.38193: _low_level_execute_command(): starting 11124 1726882393.38195: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882393.0648377-12641-246301796651250/ > /dev/null 2>&1 && sleep 0' 11124 1726882393.38704: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882393.38713: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.38723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.38737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.38780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.38786: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882393.38797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.38809: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882393.38816: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882393.38822: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882393.38830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.38840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.38849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.38861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.38869: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882393.38879: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.38952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.38976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.38989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.39114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.40913: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.40966: stderr chunk (state=3): >>><<< 11124 1726882393.40970: stdout chunk (state=3): >>><<< 11124 1726882393.40985: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.40992: handler run complete 11124 1726882393.41009: Evaluated conditional (False): 
False 11124 1726882393.41018: attempt loop complete, returning result 11124 1726882393.41020: _execute() done 11124 1726882393.41023: dumping result to json 11124 1726882393.41032: done dumping result, returning 11124 1726882393.41040: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [0e448fcc-3ce9-8362-0f62-0000000000c6] 11124 1726882393.41043: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c6 11124 1726882393.41142: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c6 11124 1726882393.41146: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.056408", "end": "2024-09-20 21:33:13.364462", "rc": 0, "start": "2024-09-20 21:33:13.308054" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11124 1726882393.41213: no more pending results, returning what we have 11124 1726882393.41216: results queue empty 11124 1726882393.41217: checking for any_errors_fatal 11124 1726882393.41225: done checking for any_errors_fatal 11124 1726882393.41226: checking for max_fail_percentage 11124 1726882393.41228: done checking for max_fail_percentage 11124 1726882393.41229: checking to see if all hosts have failed and the running result is not ok 11124 1726882393.41230: done checking to see if all hosts have failed 11124 1726882393.41230: getting the remaining hosts for this loop 11124 1726882393.41232: done getting the remaining hosts for this loop 11124 1726882393.41235: getting 
the next task for host managed_node1 11124 1726882393.41241: done getting next task for host managed_node1 11124 1726882393.41243: ^ task is: TASK: Stop dnsmasq/radvd services 11124 1726882393.41247: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882393.41276: getting variables 11124 1726882393.41278: in VariableManager get_vars() 11124 1726882393.41323: Calling all_inventory to load vars for managed_node1 11124 1726882393.41326: Calling groups_inventory to load vars for managed_node1 11124 1726882393.41328: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882393.41338: Calling all_plugins_play to load vars for managed_node1 11124 1726882393.41341: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882393.41343: Calling groups_plugins_play to load vars for managed_node1 11124 1726882393.42767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882393.44365: done with get_vars() 11124 1726882393.44388: done getting variables 11124 1726882393.44433: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:33:13 -0400 (0:00:00.442) 0:00:33.687 ****** 11124 1726882393.44461: entering _queue_task() for managed_node1/shell 11124 1726882393.44706: worker is 1 (out of 1 available) 11124 1726882393.44718: exiting _queue_task() for managed_node1/shell 11124 1726882393.44731: done queuing things up, now waiting for results queue to drain 11124 1726882393.44733: waiting for pending results... 
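The "Remove test interfaces" task result earlier in this log embeds its cleanup script as a JSON-escaped string. The key idiom there is `cmd || rc="$?"`: under `set -e`, appending `|| rc="$?"` records a command's failure code without aborting the script, so every `ip link delete` is attempted even if an earlier one fails. A minimal, runnable sketch of that pattern follows; `delete_link` is a hypothetical stand-in for `ip link delete` so the sketch needs no root privileges:

```shell
#!/bin/sh
# Sketch of the error-capture pattern from the task's cleanup script.
# Under `set -e`, a bare failing command aborts the script, but
# `cmd || rc="$?"` records the failure and continues.
set -eu

# Hypothetical stand-in for `ip link delete "$1"`: succeeds only for
# the link names the test suite created, fails (rc=1) otherwise.
delete_link() {
  case "$1" in
    test1|test2|testbr) return 0 ;;
    *) return 1 ;;
  esac
}

for link in test1 missing testbr; do
  rc=0
  delete_link "$link" || rc="$?"   # capture failure without aborting
  if [ "$rc" != 0 ]; then
    echo "ERROR - could not delete link $link - error $rc"
  fi
done
```

The real task additionally runs `exec 1>&2` so that all trace output from `set -x` lands on stderr, which is why the module result above shows an empty `stdout` and the `+ ip link delete …` trace in `stderr`.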
11124 1726882393.44910: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 11124 1726882393.45001: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000c7 11124 1726882393.45011: variable 'ansible_search_path' from source: unknown 11124 1726882393.45014: variable 'ansible_search_path' from source: unknown 11124 1726882393.45047: calling self._execute() 11124 1726882393.45122: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.45126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.45134: variable 'omit' from source: magic vars 11124 1726882393.45404: variable 'ansible_distribution_major_version' from source: facts 11124 1726882393.45414: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882393.45421: variable 'omit' from source: magic vars 11124 1726882393.45461: variable 'omit' from source: magic vars 11124 1726882393.45486: variable 'omit' from source: magic vars 11124 1726882393.45519: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882393.45545: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882393.45565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882393.45578: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882393.45587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882393.45611: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882393.45614: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.45617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 
1726882393.45690: Set connection var ansible_shell_executable to /bin/sh 11124 1726882393.45696: Set connection var ansible_shell_type to sh 11124 1726882393.45703: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882393.45708: Set connection var ansible_timeout to 10 11124 1726882393.45712: Set connection var ansible_pipelining to False 11124 1726882393.45715: Set connection var ansible_connection to ssh 11124 1726882393.45733: variable 'ansible_shell_executable' from source: unknown 11124 1726882393.45736: variable 'ansible_connection' from source: unknown 11124 1726882393.45738: variable 'ansible_module_compression' from source: unknown 11124 1726882393.45742: variable 'ansible_shell_type' from source: unknown 11124 1726882393.45744: variable 'ansible_shell_executable' from source: unknown 11124 1726882393.45747: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.45749: variable 'ansible_pipelining' from source: unknown 11124 1726882393.45754: variable 'ansible_timeout' from source: unknown 11124 1726882393.45756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.45857: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882393.45869: variable 'omit' from source: magic vars 11124 1726882393.45872: starting attempt loop 11124 1726882393.45874: running the handler 11124 1726882393.45883: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882393.45899: 
_low_level_execute_command(): starting 11124 1726882393.45905: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882393.46630: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882393.46641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.46654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.46673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.46706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.46713: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882393.46723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.46735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882393.46744: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882393.46749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882393.46757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.46768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.46782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.46788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.46793: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882393.46804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.46876: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 11124 1726882393.46903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.46908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.47047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.48627: stdout chunk (state=3): >>>/root <<< 11124 1726882393.48724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.48782: stderr chunk (state=3): >>><<< 11124 1726882393.48785: stdout chunk (state=3): >>><<< 11124 1726882393.48806: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.48817: _low_level_execute_command(): starting 11124 1726882393.48829: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347 `" && echo ansible-tmp-1726882393.4880545-12669-152336638768347="` echo /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347 `" ) && sleep 0' 11124 1726882393.49277: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.49285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.49320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.49325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.49335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.49340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.49347: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.49352: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882393.49361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.49426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.49429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.49529: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11124 1726882393.51380: stdout chunk (state=3): >>>ansible-tmp-1726882393.4880545-12669-152336638768347=/root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347 <<< 11124 1726882393.51488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.51532: stderr chunk (state=3): >>><<< 11124 1726882393.51535: stdout chunk (state=3): >>><<< 11124 1726882393.51554: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882393.4880545-12669-152336638768347=/root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.51587: variable 'ansible_module_compression' from source: unknown 11124 1726882393.51628: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 
1726882393.51663: variable 'ansible_facts' from source: unknown 11124 1726882393.51724: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/AnsiballZ_command.py 11124 1726882393.51834: Sending initial data 11124 1726882393.51837: Sent initial data (156 bytes) 11124 1726882393.52513: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.52518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.52554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found <<< 11124 1726882393.52557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.52560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882393.52562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.52616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.52620: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.52626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.52717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.54435: stderr chunk (state=3): 
>>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882393.54524: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882393.54616: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpmkuuxu6a /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/AnsiballZ_command.py <<< 11124 1726882393.54706: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882393.55684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.55783: stderr chunk (state=3): >>><<< 11124 1726882393.55787: stdout chunk (state=3): >>><<< 11124 1726882393.55802: done transferring module to remote 11124 1726882393.55811: _low_level_execute_command(): starting 11124 1726882393.55816: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/ /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/AnsiballZ_command.py && sleep 0' 11124 1726882393.56252: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.56262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.56306: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.56310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.56312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.56366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.56376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.56379: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.56488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.58208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.58255: stderr chunk (state=3): >>><<< 11124 1726882393.58258: stdout chunk (state=3): >>><<< 11124 1726882393.58272: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.58275: _low_level_execute_command(): starting 11124 1726882393.58280: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/AnsiballZ_command.py && sleep 0' 11124 1726882393.58709: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.58714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.58761: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.58768: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.58771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.58826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.58829: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.58835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.58931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.73973: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:13.718321", "end": "2024-09-20 21:33:13.738085", "delta": "0:00:00.019764", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service 
\"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882393.75224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882393.75228: stderr chunk (state=3): >>><<< 11124 1726882393.75233: stdout chunk (state=3): >>><<< 11124 1726882393.75256: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:33:13.718321", "end": "2024-09-20 21:33:13.738085", "delta": "0:00:00.019764", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, 
"expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882393.75302: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882393.75310: _low_level_execute_command(): starting 11124 1726882393.75315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882393.4880545-12669-152336638768347/ > /dev/null 2>&1 && sleep 0' 11124 1726882393.76662: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882393.76668: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.76670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.76673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 
1726882393.76675: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.76677: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882393.76687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.76703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882393.76706: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882393.76709: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882393.76711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882393.76714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882393.76716: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882393.76719: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882393.76721: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882393.76724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882393.76726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882393.76729: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882393.76731: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882393.76856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882393.78681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882393.78755: stderr chunk (state=3): >>><<< 11124 1726882393.78758: stdout chunk (state=3): >>><<< 11124 1726882393.78775: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882393.78782: handler run complete 11124 1726882393.78806: Evaluated conditional (False): False 11124 1726882393.78817: attempt loop complete, returning result 11124 1726882393.78820: _execute() done 11124 1726882393.78822: dumping result to json 11124 1726882393.78828: done dumping result, returning 11124 1726882393.78836: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [0e448fcc-3ce9-8362-0f62-0000000000c7] 11124 1726882393.78841: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c7 11124 1726882393.78949: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c7 11124 1726882393.78954: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.019764", "end": "2024-09-20 21:33:13.738085", "rc": 0, "start": "2024-09-20 21:33:13.718321" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11124 1726882393.79033: no more pending results, returning what we have 11124 1726882393.79036: results queue empty 11124 1726882393.79038: checking for any_errors_fatal 11124 1726882393.79048: done checking for any_errors_fatal 11124 1726882393.79049: checking for max_fail_percentage 11124 1726882393.79050: done checking for max_fail_percentage 11124 1726882393.79051: checking to see if all hosts have failed and the running result is not ok 11124 1726882393.79052: done checking to see if all hosts have failed 11124 1726882393.79053: getting the remaining hosts for this loop 11124 1726882393.79055: done getting the remaining hosts for this loop 11124 1726882393.79058: getting the next task for host managed_node1 11124 1726882393.79069: done getting next task for host managed_node1 11124 1726882393.79072: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11124 1726882393.79076: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882393.79086: getting variables 11124 1726882393.79088: in VariableManager get_vars() 11124 1726882393.79130: Calling all_inventory to load vars for managed_node1 11124 1726882393.79132: Calling groups_inventory to load vars for managed_node1 11124 1726882393.79135: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882393.79147: Calling all_plugins_play to load vars for managed_node1 11124 1726882393.79150: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882393.79152: Calling groups_plugins_play to load vars for managed_node1 11124 1726882393.82426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882393.84604: done with get_vars() 11124 1726882393.84635: done getting variables 11124 1726882393.84700: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Friday 20 September 2024 21:33:13 -0400 (0:00:00.402) 0:00:34.089 ****** 11124 1726882393.84730: entering _queue_task() for managed_node1/command 11124 1726882393.85061: worker is 1 (out of 1 available) 11124 1726882393.85077: exiting _queue_task() for managed_node1/command 11124 1726882393.85090: done queuing 
things up, now waiting for results queue to drain 11124 1726882393.85091: waiting for pending results... 11124 1726882393.85369: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 11124 1726882393.85458: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000c8 11124 1726882393.85471: variable 'ansible_search_path' from source: unknown 11124 1726882393.85507: calling self._execute() 11124 1726882393.85602: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.85606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.85616: variable 'omit' from source: magic vars 11124 1726882393.85989: variable 'ansible_distribution_major_version' from source: facts 11124 1726882393.86000: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882393.86116: variable 'network_provider' from source: set_fact 11124 1726882393.86120: Evaluated conditional (network_provider == "initscripts"): False 11124 1726882393.86123: when evaluation is False, skipping this task 11124 1726882393.86126: _execute() done 11124 1726882393.86128: dumping result to json 11124 1726882393.86132: done dumping result, returning 11124 1726882393.86138: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [0e448fcc-3ce9-8362-0f62-0000000000c8] 11124 1726882393.86144: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c8 11124 1726882393.86233: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c8 11124 1726882393.86236: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11124 1726882393.86286: no more pending results, returning what we have 11124 1726882393.86290: results queue empty 11124 1726882393.86291: checking for any_errors_fatal 11124 1726882393.86304: done checking for 
any_errors_fatal 11124 1726882393.86305: checking for max_fail_percentage 11124 1726882393.86307: done checking for max_fail_percentage 11124 1726882393.86308: checking to see if all hosts have failed and the running result is not ok 11124 1726882393.86309: done checking to see if all hosts have failed 11124 1726882393.86310: getting the remaining hosts for this loop 11124 1726882393.86311: done getting the remaining hosts for this loop 11124 1726882393.86315: getting the next task for host managed_node1 11124 1726882393.86322: done getting next task for host managed_node1 11124 1726882393.86325: ^ task is: TASK: Verify network state restored to default 11124 1726882393.86329: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882393.86333: getting variables 11124 1726882393.86335: in VariableManager get_vars() 11124 1726882393.86379: Calling all_inventory to load vars for managed_node1 11124 1726882393.86382: Calling groups_inventory to load vars for managed_node1 11124 1726882393.86384: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882393.86399: Calling all_plugins_play to load vars for managed_node1 11124 1726882393.86402: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882393.86405: Calling groups_plugins_play to load vars for managed_node1 11124 1726882393.88099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882393.89684: done with get_vars() 11124 1726882393.89708: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Friday 20 September 2024 21:33:13 -0400 (0:00:00.050) 0:00:34.140 ****** 11124 1726882393.89807: entering _queue_task() for managed_node1/include_tasks 11124 1726882393.90098: worker is 1 (out of 1 available) 11124 1726882393.90111: exiting _queue_task() for managed_node1/include_tasks 11124 1726882393.90124: done queuing things up, now waiting for results queue to drain 11124 1726882393.90126: waiting for pending results... 
11124 1726882393.91009: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 11124 1726882393.91472: in run() - task 0e448fcc-3ce9-8362-0f62-0000000000c9 11124 1726882393.91476: variable 'ansible_search_path' from source: unknown 11124 1726882393.91478: calling self._execute() 11124 1726882393.91595: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882393.91599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882393.91610: variable 'omit' from source: magic vars 11124 1726882393.92407: variable 'ansible_distribution_major_version' from source: facts 11124 1726882393.92411: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882393.92413: _execute() done 11124 1726882393.92416: dumping result to json 11124 1726882393.92418: done dumping result, returning 11124 1726882393.92420: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0e448fcc-3ce9-8362-0f62-0000000000c9] 11124 1726882393.92629: sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c9 11124 1726882393.92724: done sending task result for task 0e448fcc-3ce9-8362-0f62-0000000000c9 11124 1726882393.92727: WORKER PROCESS EXITING 11124 1726882393.92755: no more pending results, returning what we have 11124 1726882393.92761: in VariableManager get_vars() 11124 1726882393.92810: Calling all_inventory to load vars for managed_node1 11124 1726882393.92814: Calling groups_inventory to load vars for managed_node1 11124 1726882393.92816: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882393.92830: Calling all_plugins_play to load vars for managed_node1 11124 1726882393.92834: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882393.92836: Calling groups_plugins_play to load vars for managed_node1 11124 1726882393.94493: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882393.96333: done with get_vars() 11124 1726882393.96354: variable 'ansible_search_path' from source: unknown 11124 1726882393.96372: we have included files to process 11124 1726882393.96374: generating all_blocks data 11124 1726882393.96376: done generating all_blocks data 11124 1726882393.96381: processing included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11124 1726882393.96382: loading included file: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11124 1726882393.96384: Loading data from /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11124 1726882393.96789: done processing included file 11124 1726882393.96792: iterating over new_blocks loaded from include file 11124 1726882393.96793: in VariableManager get_vars() 11124 1726882393.96811: done with get_vars() 11124 1726882393.96813: filtering new block on tags 11124 1726882393.96845: done filtering new block on tags 11124 1726882393.96848: done iterating over new_blocks loaded from include file included: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 11124 1726882393.96852: extending task lists for all hosts with included blocks 11124 1726882393.98073: done extending task lists 11124 1726882393.98075: done processing included files 11124 1726882393.98076: results queue empty 11124 1726882393.98076: checking for any_errors_fatal 11124 1726882393.98080: done checking for any_errors_fatal 11124 1726882393.98081: checking for max_fail_percentage 11124 1726882393.98082: done checking for max_fail_percentage 11124 1726882393.98083: checking to see if all hosts have failed and the running 
result is not ok 11124 1726882393.98084: done checking to see if all hosts have failed 11124 1726882393.98084: getting the remaining hosts for this loop 11124 1726882393.98085: done getting the remaining hosts for this loop 11124 1726882393.98088: getting the next task for host managed_node1 11124 1726882393.98092: done getting next task for host managed_node1 11124 1726882393.98094: ^ task is: TASK: Check routes and DNS 11124 1726882393.98097: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11124 1726882393.98100: getting variables 11124 1726882393.98101: in VariableManager get_vars() 11124 1726882393.98114: Calling all_inventory to load vars for managed_node1 11124 1726882393.98116: Calling groups_inventory to load vars for managed_node1 11124 1726882393.98118: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882393.98124: Calling all_plugins_play to load vars for managed_node1 11124 1726882393.98127: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882393.98129: Calling groups_plugins_play to load vars for managed_node1 11124 1726882393.99292: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882394.02660: done with get_vars() 11124 1726882394.02690: done getting variables 11124 1726882394.02735: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:33:14 -0400 (0:00:00.129) 0:00:34.270 ****** 11124 1726882394.02768: entering _queue_task() for managed_node1/shell 11124 1726882394.03064: worker is 1 (out of 1 available) 11124 1726882394.03077: exiting _queue_task() for managed_node1/shell 11124 1726882394.03089: done queuing things up, now waiting for results queue to drain 11124 1726882394.03091: waiting for pending results... 
11124 1726882394.03354: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 11124 1726882394.03479: in run() - task 0e448fcc-3ce9-8362-0f62-000000000570 11124 1726882394.03497: variable 'ansible_search_path' from source: unknown 11124 1726882394.03504: variable 'ansible_search_path' from source: unknown 11124 1726882394.03545: calling self._execute() 11124 1726882394.03638: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882394.03654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882394.03674: variable 'omit' from source: magic vars 11124 1726882394.04028: variable 'ansible_distribution_major_version' from source: facts 11124 1726882394.04044: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882394.04054: variable 'omit' from source: magic vars 11124 1726882394.04110: variable 'omit' from source: magic vars 11124 1726882394.04147: variable 'omit' from source: magic vars 11124 1726882394.04197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882394.04232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 1726882394.04254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882394.04281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882394.04299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882394.04329: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882394.04337: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882394.04344: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882394.04444: 
Set connection var ansible_shell_executable to /bin/sh 11124 1726882394.04456: Set connection var ansible_shell_type to sh 11124 1726882394.04469: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882394.04478: Set connection var ansible_timeout to 10 11124 1726882394.04486: Set connection var ansible_pipelining to False 11124 1726882394.04492: Set connection var ansible_connection to ssh 11124 1726882394.04523: variable 'ansible_shell_executable' from source: unknown 11124 1726882394.04530: variable 'ansible_connection' from source: unknown 11124 1726882394.04537: variable 'ansible_module_compression' from source: unknown 11124 1726882394.04542: variable 'ansible_shell_type' from source: unknown 11124 1726882394.04548: variable 'ansible_shell_executable' from source: unknown 11124 1726882394.04554: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882394.04561: variable 'ansible_pipelining' from source: unknown 11124 1726882394.04569: variable 'ansible_timeout' from source: unknown 11124 1726882394.04576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882394.04712: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882394.04731: variable 'omit' from source: magic vars 11124 1726882394.04739: starting attempt loop 11124 1726882394.04745: running the handler 11124 1726882394.04758: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882394.04783: 
_low_level_execute_command(): starting 11124 1726882394.04794: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882394.05543: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.05557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.05574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.05593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.05636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.05647: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.05662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.05684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.05695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.05707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.05719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.05731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.05745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.05757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.05770: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.05784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.05865: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 11124 1726882394.05889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.05904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.06035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.07712: stdout chunk (state=3): >>>/root <<< 11124 1726882394.07878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.07883: stdout chunk (state=3): >>><<< 11124 1726882394.07892: stderr chunk (state=3): >>><<< 11124 1726882394.07915: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.07927: _low_level_execute_command(): starting 11124 1726882394.07938: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491 `" && echo ansible-tmp-1726882394.0791478-12689-245354492728491="` echo /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491 `" ) && sleep 0' 11124 1726882394.08735: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.08756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.08775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.08794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.08838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.08855: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.08873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.08896: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.08908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.08919: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.08931: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.08945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.08965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.08980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.08996: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.09010: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.09090: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.09117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.09133: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.09265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.11155: stdout chunk (state=3): >>>ansible-tmp-1726882394.0791478-12689-245354492728491=/root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491 <<< 11124 1726882394.11278: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.11348: stderr chunk (state=3): >>><<< 11124 1726882394.11351: stdout chunk (state=3): >>><<< 11124 1726882394.11378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882394.0791478-12689-245354492728491=/root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.11410: variable 'ansible_module_compression' from source: unknown 11124 1726882394.11473: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882394.11509: variable 'ansible_facts' from source: unknown 11124 1726882394.11601: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/AnsiballZ_command.py 11124 1726882394.12228: Sending initial data 11124 1726882394.12231: Sent initial data (156 bytes) 11124 1726882394.13696: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.13703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.13744: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.13750: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration <<< 11124 1726882394.13765: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.13772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found <<< 11124 1726882394.13777: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.13858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.13867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.13879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.14040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.15734: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882394.15824: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882394.15920: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmpqkf_ntbh /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/AnsiballZ_command.py <<< 11124 1726882394.16009: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882394.17398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.17570: stderr chunk (state=3): >>><<< 11124 1726882394.17573: stdout chunk (state=3): >>><<< 11124 1726882394.17697: done transferring module to remote 11124 1726882394.17700: _low_level_execute_command(): starting 11124 1726882394.17703: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/ /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/AnsiballZ_command.py && sleep 0' 11124 1726882394.19381: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.19398: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.19413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.19434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.19509: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.19558: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.19576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.19595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.19607: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.19619: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.19632: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.19647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.19678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.19779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.19792: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.19807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.19895: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.19912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.19992: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.20126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.21991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.22050: stderr chunk (state=3): >>><<< 11124 1726882394.22056: stdout chunk (state=3): >>><<< 11124 1726882394.22161: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.22168: _low_level_execute_command(): starting 11124 1726882394.22171: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/AnsiballZ_command.py && sleep 0' 11124 1726882394.23490: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.23496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.23507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.23521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.23559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.23879: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.23890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.23904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.23911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.23918: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.23925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.23934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.23946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.23955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.23960: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.23972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.24044: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 
1726882394.24066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.24079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.24208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.38148: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3077sec preferred_lft 3077sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:33:14.371352", "end": "2024-09-20 21:33:14.379918", "delta": "0:00:00.008566", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat 
/etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882394.39382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882394.39386: stdout chunk (state=3): >>><<< 11124 1726882394.39388: stderr chunk (state=3): >>><<< 11124 1726882394.39412: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0\n valid_lft 3077sec preferred_lft 3077sec\n inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 \n10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:33:14.371352", "end": 
"2024-09-20 21:33:14.379918", "delta": "0:00:00.008566", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 
11124 1726882394.39458: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882394.39468: _low_level_execute_command(): starting 11124 1726882394.39473: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882394.0791478-12689-245354492728491/ > /dev/null 2>&1 && sleep 0' 11124 1726882394.41178: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.41188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.41198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.41212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.41282: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.41320: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.41329: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.41342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.41374: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.41381: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.41389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.41402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.41413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.41460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.41469: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.41479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.41556: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.41706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.41720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.41856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.43759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.43763: stdout chunk (state=3): >>><<< 11124 1726882394.43767: stderr chunk (state=3): >>><<< 11124 1726882394.43795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.43803: handler run complete 11124 1726882394.43826: Evaluated conditional (False): False 11124 1726882394.43836: attempt loop complete, returning result 11124 1726882394.43839: _execute() done 11124 1726882394.43841: dumping result to json 11124 1726882394.43848: done dumping result, returning 11124 1726882394.43857: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0e448fcc-3ce9-8362-0f62-000000000570] 11124 1726882394.43862: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000570 11124 1726882394.43978: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000570 11124 1726882394.43981: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008566", "end": "2024-09-20 21:33:14.379918", "rc": 0, "start": "2024-09-20 21:33:14.371352" } STDOUT: IP 1: lo: 
mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 02:9e:a1:0b:f9:6d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.44.90/22 brd 10.31.47.255 scope global dynamic noprefixroute eth0 valid_lft 3077sec preferred_lft 3077sec inet6 fe80::9e:a1ff:fe0b:f96d/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.44.1 dev eth0 proto dhcp src 10.31.44.90 metric 100 10.31.44.0/22 dev eth0 proto kernel scope link src 10.31.44.90 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11124 1726882394.44144: no more pending results, returning what we have 11124 1726882394.44148: results queue empty 11124 1726882394.44149: checking for any_errors_fatal 11124 1726882394.44151: done checking for any_errors_fatal 11124 1726882394.44152: checking for max_fail_percentage 11124 1726882394.44153: done checking for max_fail_percentage 11124 1726882394.44154: checking to see if all hosts have failed and the running result is not ok 11124 1726882394.44156: done checking to see if all hosts have failed 11124 1726882394.44156: getting the remaining hosts for this loop 11124 1726882394.44158: done getting the remaining hosts for this loop 11124 1726882394.44162: getting the next task for host managed_node1 11124 1726882394.44172: done getting next task for host managed_node1 11124 1726882394.44176: ^ task is: TASK: Verify DNS and network connectivity 11124 1726882394.44180: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11124 1726882394.44187: getting variables 11124 1726882394.44189: in VariableManager get_vars() 11124 1726882394.44233: Calling all_inventory to load vars for managed_node1 11124 1726882394.44236: Calling groups_inventory to load vars for managed_node1 11124 1726882394.44239: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882394.44251: Calling all_plugins_play to load vars for managed_node1 11124 1726882394.44254: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882394.44257: Calling groups_plugins_play to load vars for managed_node1 11124 1726882394.46459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882394.48372: done with get_vars() 11124 1726882394.48405: done getting variables 11124 1726882394.48479: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: 
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:33:14 -0400 (0:00:00.457) 0:00:34.727 ****** 11124 1726882394.48511: entering _queue_task() for managed_node1/shell 11124 1726882394.48888: worker is 1 (out of 1 available) 11124 1726882394.48901: exiting _queue_task() for managed_node1/shell 11124 1726882394.48913: done queuing things up, now waiting for results queue to drain 11124 1726882394.48915: waiting for pending results... 11124 1726882394.49338: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 11124 1726882394.49758: in run() - task 0e448fcc-3ce9-8362-0f62-000000000571 11124 1726882394.49773: variable 'ansible_search_path' from source: unknown 11124 1726882394.49777: variable 'ansible_search_path' from source: unknown 11124 1726882394.49824: calling self._execute() 11124 1726882394.49925: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882394.49929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882394.49940: variable 'omit' from source: magic vars 11124 1726882394.50369: variable 'ansible_distribution_major_version' from source: facts 11124 1726882394.50385: Evaluated conditional (ansible_distribution_major_version != '6'): True 11124 1726882394.50530: variable 'ansible_facts' from source: unknown 11124 1726882394.51395: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11124 1726882394.51399: variable 'omit' from source: magic vars 11124 1726882394.52146: variable 'omit' from source: magic vars 11124 1726882394.52180: variable 'omit' from source: magic vars 11124 1726882394.52371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11124 1726882394.52399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11124 
1726882394.52422: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11124 1726882394.52436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882394.52567: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11124 1726882394.52597: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11124 1726882394.52600: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882394.52603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882394.52753: Set connection var ansible_shell_executable to /bin/sh 11124 1726882394.52770: Set connection var ansible_shell_type to sh 11124 1726882394.52789: Set connection var ansible_module_compression to ZIP_DEFLATED 11124 1726882394.52802: Set connection var ansible_timeout to 10 11124 1726882394.52832: Set connection var ansible_pipelining to False 11124 1726882394.52839: Set connection var ansible_connection to ssh 11124 1726882394.52872: variable 'ansible_shell_executable' from source: unknown 11124 1726882394.52880: variable 'ansible_connection' from source: unknown 11124 1726882394.52887: variable 'ansible_module_compression' from source: unknown 11124 1726882394.52893: variable 'ansible_shell_type' from source: unknown 11124 1726882394.52898: variable 'ansible_shell_executable' from source: unknown 11124 1726882394.52904: variable 'ansible_host' from source: host vars for 'managed_node1' 11124 1726882394.52911: variable 'ansible_pipelining' from source: unknown 11124 1726882394.52918: variable 'ansible_timeout' from source: unknown 11124 1726882394.52926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11124 1726882394.53077: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882394.53095: variable 'omit' from source: magic vars 11124 1726882394.53104: starting attempt loop 11124 1726882394.53112: running the handler 11124 1726882394.53126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11124 1726882394.53147: _low_level_execute_command(): starting 11124 1726882394.53162: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11124 1726882394.53916: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.53930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.53943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.53967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.54009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.54020: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.54033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.54049: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.54066: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.54076: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 11124 1726882394.54088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.54102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.54117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.54128: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.54139: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.54154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.54232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.54248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.54268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.54397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.56080: stdout chunk (state=3): >>>/root <<< 11124 1726882394.56261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.56596: stdout chunk (state=3): >>><<< 11124 1726882394.56599: stderr chunk (state=3): >>><<< 11124 1726882394.56603: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.56613: _low_level_execute_command(): starting 11124 1726882394.56616: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007 `" && echo ansible-tmp-1726882394.5630128-12715-148991777868007="` echo /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007 `" ) && sleep 0' 11124 1726882394.56962: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.57553: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.57572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.57587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.57627: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.57635: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.57645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.57662: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.57682: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.57693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.57702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.57711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.57721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.57729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.57736: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.57745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.57824: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.57840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.57843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.57993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.59887: stdout chunk (state=3): >>>ansible-tmp-1726882394.5630128-12715-148991777868007=/root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007 <<< 11124 1726882394.60082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.60086: stdout chunk (state=3): >>><<< 11124 1726882394.60093: stderr chunk (state=3): >>><<< 11124 1726882394.60113: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882394.5630128-12715-148991777868007=/root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.60150: variable 'ansible_module_compression' from source: unknown 11124 1726882394.60248: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11124tk8rt4bo/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11124 1726882394.60286: variable 'ansible_facts' from source: unknown 11124 1726882394.60389: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/AnsiballZ_command.py 11124 1726882394.61027: Sending initial data 11124 1726882394.61031: Sent initial data (156 bytes) 11124 1726882394.62801: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.62828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.62841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.62859: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.62900: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.62907: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.62919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.62935: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.62948: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.62957: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.62967: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.62978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.62995: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.63010: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.63017: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.63026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.63111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.63126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.63136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.63273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.65021: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 
debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11124 1726882394.65105: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 11124 1726882394.65202: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11124tk8rt4bo/tmp7xn2twtb /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/AnsiballZ_command.py <<< 11124 1726882394.65294: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 11124 1726882394.66657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.66983: stderr chunk (state=3): >>><<< 11124 1726882394.66987: stdout chunk (state=3): >>><<< 11124 1726882394.66990: done transferring module to remote 11124 1726882394.66992: _low_level_execute_command(): starting 11124 1726882394.66995: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/ /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/AnsiballZ_command.py && sleep 0' 11124 1726882394.67681: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.67697: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.67713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.67732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.67786: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 
originally 10.31.44.90 <<< 11124 1726882394.67797: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882394.67810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.67827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.67838: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.67859: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.67875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.67888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.67903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.67914: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.67924: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.67936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.68017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.68038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.68057: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.68205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.70000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882394.70106: stderr chunk (state=3): >>><<< 11124 1726882394.70117: stdout chunk (state=3): >>><<< 11124 1726882394.70238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882394.70242: _low_level_execute_command(): starting 11124 1726882394.70245: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/AnsiballZ_command.py && sleep 0' 11124 1726882394.71021: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882394.71036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.71053: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.71075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.71127: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.71143: stderr chunk (state=3): >>>debug2: match not found <<< 11124 
1726882394.71179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.71197: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882394.71209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882394.71227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882394.71243: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882394.71261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882394.71280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882394.71293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882394.71304: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882394.71318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882394.71425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882394.71456: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882394.71477: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882394.71608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882394.99552: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org 
mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5648 0 --:--:-- --:--:-- --:--:-- 5648\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4157 0 --:--:-- --:--:-- --:--:-- 4157", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:33:14.845794", "end": "2024-09-20 21:33:14.993962", "delta": "0:00:00.148168", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! 
getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11124 1726882395.00935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. <<< 11124 1726882395.00939: stdout chunk (state=3): >>><<< 11124 1726882395.00942: stderr chunk (state=3): >>><<< 11124 1726882395.01094: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total 
% Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5648 0 --:--:-- --:--:-- --:--:-- 5648\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4157 0 --:--:-- --:--:-- --:--:-- 4157", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:33:14.845794", "end": "2024-09-20 21:33:14.993962", "delta": "0:00:00.148168", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.44.90 closed. 11124 1726882395.01097: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11124 1726882395.01100: _low_level_execute_command(): starting 11124 1726882395.01103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882394.5630128-12715-148991777868007/ > /dev/null 2>&1 && sleep 0' 11124 1726882395.01646: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11124 1726882395.01662: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882395.01681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882395.01707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882395.01940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882395.01953: stderr chunk (state=3): >>>debug2: match not found <<< 11124 1726882395.01974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882395.01993: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11124 1726882395.02006: stderr chunk 
(state=3): >>>debug2: resolve_canonicalize: hostname 10.31.44.90 is address <<< 11124 1726882395.02018: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11124 1726882395.02031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11124 1726882395.02044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11124 1726882395.02061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11124 1726882395.02081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 <<< 11124 1726882395.02093: stderr chunk (state=3): >>>debug2: match found <<< 11124 1726882395.02107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11124 1726882395.02182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 11124 1726882395.02205: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11124 1726882395.02222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11124 1726882395.02350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11124 1726882395.04291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11124 1726882395.04307: stderr chunk (state=3): >>><<< 11124 1726882395.04310: stdout chunk (state=3): >>><<< 11124 1726882395.04331: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.44.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.44.90 originally 10.31.44.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11124 1726882395.04338: handler run complete 11124 1726882395.04367: Evaluated conditional (False): False 11124 1726882395.04378: attempt loop complete, returning result 11124 1726882395.04381: _execute() done 11124 1726882395.04383: dumping result to json 11124 1726882395.04390: done dumping result, returning 11124 1726882395.04398: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0e448fcc-3ce9-8362-0f62-000000000571] 11124 1726882395.04405: sending task result for task 0e448fcc-3ce9-8362-0f62-000000000571 11124 1726882395.04518: done sending task result for task 0e448fcc-3ce9-8362-0f62-000000000571 11124 1726882395.04521: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.148168", "end": "2024-09-20 21:33:14.993962", "rc": 0, "start": "2024-09-20 21:33:14.845794" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 5648 0 --:--:-- --:--:-- --:--:-- 5648 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 4157 0 --:--:-- --:--:-- --:--:-- 4157 11124 1726882395.04655: no more pending results, returning what we have 11124 1726882395.04661: results queue empty 11124 1726882395.04662: 
checking for any_errors_fatal 11124 1726882395.04676: done checking for any_errors_fatal 11124 1726882395.04678: checking for max_fail_percentage 11124 1726882395.04680: done checking for max_fail_percentage 11124 1726882395.04681: checking to see if all hosts have failed and the running result is not ok 11124 1726882395.04684: done checking to see if all hosts have failed 11124 1726882395.04685: getting the remaining hosts for this loop 11124 1726882395.04686: done getting the remaining hosts for this loop 11124 1726882395.04690: getting the next task for host managed_node1 11124 1726882395.04701: done getting next task for host managed_node1 11124 1726882395.04704: ^ task is: TASK: meta (flush_handlers) 11124 1726882395.04706: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882395.04711: getting variables 11124 1726882395.04713: in VariableManager get_vars() 11124 1726882395.04759: Calling all_inventory to load vars for managed_node1 11124 1726882395.04762: Calling groups_inventory to load vars for managed_node1 11124 1726882395.04767: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882395.04782: Calling all_plugins_play to load vars for managed_node1 11124 1726882395.04786: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882395.04789: Calling groups_plugins_play to load vars for managed_node1 11124 1726882395.06819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882395.12812: done with get_vars() 11124 1726882395.12833: done getting variables 11124 1726882395.12879: in VariableManager get_vars() 11124 1726882395.12890: Calling all_inventory to load vars for managed_node1 11124 1726882395.12891: Calling groups_inventory to load vars for managed_node1 11124 1726882395.12893: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882395.12896: Calling all_plugins_play to load vars for managed_node1 11124 1726882395.12898: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882395.12899: Calling groups_plugins_play to load vars for managed_node1 11124 1726882395.13562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882395.14873: done with get_vars() 11124 1726882395.14906: done queuing things up, now waiting for results queue to drain 11124 1726882395.14908: results queue empty 11124 1726882395.14909: checking for any_errors_fatal 11124 1726882395.14913: done checking for any_errors_fatal 11124 1726882395.14914: checking for max_fail_percentage 11124 1726882395.14915: done checking for max_fail_percentage 11124 1726882395.14916: checking to see if all hosts have failed and the running result is not 
ok 11124 1726882395.14916: done checking to see if all hosts have failed 11124 1726882395.14917: getting the remaining hosts for this loop 11124 1726882395.14918: done getting the remaining hosts for this loop 11124 1726882395.14921: getting the next task for host managed_node1 11124 1726882395.14925: done getting next task for host managed_node1 11124 1726882395.14926: ^ task is: TASK: meta (flush_handlers) 11124 1726882395.14928: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11124 1726882395.14936: getting variables 11124 1726882395.14937: in VariableManager get_vars() 11124 1726882395.14952: Calling all_inventory to load vars for managed_node1 11124 1726882395.14954: Calling groups_inventory to load vars for managed_node1 11124 1726882395.14956: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882395.14962: Calling all_plugins_play to load vars for managed_node1 11124 1726882395.14966: Calling groups_plugins_inventory to load vars for managed_node1 11124 1726882395.14969: Calling groups_plugins_play to load vars for managed_node1 11124 1726882395.16259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882395.18230: done with get_vars() 11124 1726882395.18259: done getting variables 11124 1726882395.18313: in VariableManager get_vars() 11124 1726882395.18328: Calling all_inventory to load vars for managed_node1 11124 1726882395.18330: Calling groups_inventory to load vars for managed_node1 11124 1726882395.18332: Calling all_plugins_inventory to load vars for managed_node1 11124 1726882395.18337: Calling all_plugins_play to load vars for managed_node1 11124 1726882395.18340: Calling groups_plugins_inventory to load vars for 
managed_node1 11124 1726882395.18342: Calling groups_plugins_play to load vars for managed_node1 11124 1726882395.19565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11124 1726882395.21354: done with get_vars() 11124 1726882395.21390: done queuing things up, now waiting for results queue to drain 11124 1726882395.21392: results queue empty 11124 1726882395.21393: checking for any_errors_fatal 11124 1726882395.21395: done checking for any_errors_fatal 11124 1726882395.21396: checking for max_fail_percentage 11124 1726882395.21397: done checking for max_fail_percentage 11124 1726882395.21397: checking to see if all hosts have failed and the running result is not ok 11124 1726882395.21398: done checking to see if all hosts have failed 11124 1726882395.21399: getting the remaining hosts for this loop 11124 1726882395.21400: done getting the remaining hosts for this loop 11124 1726882395.21403: getting the next task for host managed_node1 11124 1726882395.21406: done getting next task for host managed_node1 11124 1726882395.21407: ^ task is: None 11124 1726882395.21409: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11124 1726882395.21410: done queuing things up, now waiting for results queue to drain 11124 1726882395.21411: results queue empty 11124 1726882395.21411: checking for any_errors_fatal 11124 1726882395.21412: done checking for any_errors_fatal 11124 1726882395.21413: checking for max_fail_percentage 11124 1726882395.21413: done checking for max_fail_percentage 11124 1726882395.21414: checking to see if all hosts have failed and the running result is not ok 11124 1726882395.21415: done checking to see if all hosts have failed 11124 1726882395.21416: getting the next task for host managed_node1 11124 1726882395.21419: done getting next task for host managed_node1 11124 1726882395.21419: ^ task is: None 11124 1726882395.21421: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed_node1              : ok=76   changed=3    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Friday 20 September 2024  21:33:15 -0400 (0:00:00.729)       0:00:35.457 ******
===============================================================================
Install dnsmasq --------------------------------------------------------- 3.66s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Gathering Facts --------------------------------------------------------- 2.00s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
fedora.linux_system_roles.network : Check which services are running ---- 1.96s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.88s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
fedora.linux_system_roles.network : Check which services are running ---- 1.58s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install pgrep, sysctl --------------------------------------------------- 1.36s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.21s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.05s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.03s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.89s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 0.83s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gather the minimum subset of ansible_facts required by the network role test --- 0.77s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Verify DNS and network connectivity ------------------------------------- 0.73s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.69s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.63s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.50s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get NM profile info ----------------------------------------------------- 0.49s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Delete the device 'deprecated-bond' ------------------------------------- 0.48s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125
Check routes and DNS ---------------------------------------------------- 0.46s
/tmp/collections-Xyq/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6
11124 1726882395.21521: RUNNING CLEANUP
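For readability, here is the script behind the "Verify DNS and network connectivity" task, reconstructed from the escaped `_raw_params` in the log and wrapped in a function so it can be reused. The host list and the `getent`/`curl` checks match the logged command exactly; the `-sS` flags on `curl` and the function wrapper are additions not present in the original task.

```shell
# Connectivity check from the "Verify DNS and network connectivity" task.
# The original task runs this inline with `set -euo pipefail` at the top;
# it is wrapped in a function here so sourcing this file executes nothing.
check_connectivity() {
  set -euo pipefail
  echo "CHECK DNS AND CONNECTIVITY"
  for host in mirrors.fedoraproject.org mirrors.centos.org; do
    # getent resolves through NSS, so it exercises the same resolver
    # configuration the managed node uses for everything else
    if ! getent hosts "$host"; then
      echo "FAILED to lookup host $host"
      return 1
    fi
    # The logged command used plain `curl -o /dev/null`; -sS added here
    # suppresses the progress meter that fills STDERR in the log above
    if ! curl -sS -o /dev/null "https://$host"; then
      echo "FAILED to contact host $host"
      return 1
    fi
  done
}
```

On a connected host, `check_connectivity` returns non-zero at the first failed lookup or HTTPS fetch, which is why the task above reported `rc: 0` only after both mirrors resolved and responded.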