[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11683 1726853246.68270: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Qi7
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11683 1726853246.68555: Added group all to inventory
11683 1726853246.68556: Added group ungrouped to inventory
11683 1726853246.68559: Group all now contains ungrouped
11683 1726853246.68561: Examining possible inventory source: /tmp/network-iHm/inventory.yml
11683 1726853246.77250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11683 1726853246.77294: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11683 1726853246.77311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11683 1726853246.77349: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11683 1726853246.77401: Loaded config def from plugin (inventory/script)
11683 1726853246.77402: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11683 1726853246.77430: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11683 1726853246.77487: Loaded config def from plugin (inventory/yaml)
11683 1726853246.77488: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
11683 1726853246.77547: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
11683 1726853246.77824: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
11683 1726853246.77827: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
11683 1726853246.77829: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
11683 1726853246.77834: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
11683 1726853246.77837: Loading data from /tmp/network-iHm/inventory.yml
11683 1726853246.77879: /tmp/network-iHm/inventory.yml was not parsable by auto
11683 1726853246.77920: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
11683 1726853246.77951: Loading data from /tmp/network-iHm/inventory.yml
11683 1726853246.78004: group all already in inventory
11683 1726853246.78009: set inventory_file for managed_node1
11683 1726853246.78012: set inventory_dir for managed_node1
11683 1726853246.78012: Added host managed_node1 to inventory
11683 1726853246.78014: Added host managed_node1 to group all
11683 1726853246.78014: set ansible_host for managed_node1
11683 1726853246.78015: set ansible_ssh_extra_args for managed_node1
11683 1726853246.78017: set inventory_file for managed_node2
11683 1726853246.78018: set inventory_dir for managed_node2
11683 1726853246.78019: Added host managed_node2 to inventory
11683 1726853246.78020: Added host managed_node2 to group all
11683 1726853246.78020: set ansible_host for managed_node2
11683 1726853246.78021: set ansible_ssh_extra_args for managed_node2
11683 1726853246.78022: set inventory_file for managed_node3
11683 1726853246.78024: set inventory_dir for managed_node3
11683 1726853246.78024: Added host managed_node3 to inventory
11683 1726853246.78025: Added host managed_node3 to group all
11683 1726853246.78025: set ansible_host for managed_node3
11683 1726853246.78026: set ansible_ssh_extra_args for managed_node3
11683 1726853246.78027: Reconcile groups and hosts in inventory.
11683 1726853246.78030: Group ungrouped now contains managed_node1
11683 1726853246.78031: Group ungrouped now contains managed_node2
11683 1726853246.78032: Group ungrouped now contains managed_node3
11683 1726853246.78087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
11683 1726853246.78166: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
11683 1726853246.78196: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
11683 1726853246.78213: Loaded config def from plugin (vars/host_group_vars)
11683 1726853246.78215: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
11683 1726853246.78219: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
11683 1726853246.78225: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
11683 1726853246.78253: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
11683 1726853246.78489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853246.78552: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
11683 1726853246.78577: Loaded config def from plugin (connection/local)
11683 1726853246.78579: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
11683 1726853246.78972: Loaded config def from plugin (connection/paramiko_ssh)
11683 1726853246.78975: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
11683 1726853246.79539: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
11683 1726853246.79567: Loaded config def from plugin (connection/psrp)
11683 1726853246.79570: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
11683 1726853246.79969: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
11683 1726853246.79993: Loaded config def from plugin (connection/ssh)
11683 1726853246.79995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
11683 1726853246.81284: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
11683 1726853246.81307: Loaded config def from plugin (connection/winrm)
11683 1726853246.81310: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
11683 1726853246.81331: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
11683 1726853246.81377: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
11683 1726853246.81415: Loaded config def from plugin (shell/cmd)
11683 1726853246.81417: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
11683 1726853246.81436: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
11683 1726853246.81475: Loaded config def from plugin (shell/powershell)
11683 1726853246.81477: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
11683 1726853246.81511: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
11683 1726853246.81614: Loaded config def from plugin (shell/sh)
11683 1726853246.81616: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
11683 1726853246.81640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
11683 1726853246.81712: Loaded config def from plugin (become/runas)
11683 1726853246.81713: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
11683 1726853246.81822: Loaded config def from plugin (become/su)
11683 1726853246.81824: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
11683 1726853246.81918: Loaded config def from plugin (become/sudo)
11683 1726853246.81920: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
11683 1726853246.81941: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11683 1726853246.82158: in VariableManager get_vars()
11683 1726853246.82174: done with get_vars()
11683 1726853246.82261: trying /usr/local/lib/python3.12/site-packages/ansible/modules
11683 1726853246.84159: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
11683 1726853246.84226: in VariableManager get_vars()
11683 1726853246.84229: done with get_vars()
11683 1726853246.84231: variable 'playbook_dir' from source: magic vars
11683 1726853246.84232: variable 'ansible_playbook_python' from source: magic vars
11683 1726853246.84232: variable 'ansible_config_file' from source: magic vars
11683 1726853246.84233: variable 'groups' from source: magic vars
11683 1726853246.84233: variable 'omit' from source: magic vars
11683 1726853246.84234: variable 'ansible_version' from source: magic vars
11683 1726853246.84234: variable 'ansible_check_mode' from source: magic vars
11683 1726853246.84235: variable 'ansible_diff_mode' from source: magic vars
11683 1726853246.84235: variable 'ansible_forks' from source: magic vars
11683 1726853246.84235: variable 'ansible_inventory_sources' from source: magic vars
11683 1726853246.84236: variable 'ansible_skip_tags' from source: magic vars
11683 1726853246.84236: variable 'ansible_limit' from source: magic vars
11683 1726853246.84237: variable 'ansible_run_tags' from source: magic vars
11683 1726853246.84237: variable 'ansible_verbosity' from source: magic vars
11683 1726853246.84264: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml
11683 1726853246.84655: in VariableManager get_vars()
11683 1726853246.84666: done with get_vars()
11683 1726853246.84673: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
11683 1726853246.85263: in VariableManager get_vars()
11683 1726853246.85274: done with get_vars()
11683 1726853246.85280: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
11683 1726853246.85343: in VariableManager get_vars()
11683 1726853246.85356: done with get_vars()
11683 1726853246.85451: in VariableManager get_vars()
11683 1726853246.85461: done with get_vars()
11683 1726853246.85467: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
11683 1726853246.85511: in VariableManager get_vars()
11683 1726853246.85520: done with get_vars()
11683 1726853246.85705: in VariableManager get_vars()
11683 1726853246.85714: done with get_vars()
11683 1726853246.85717: variable 'omit' from source: magic vars
11683 1726853246.85728: variable 'omit' from source: magic vars
11683 1726853246.85748: in VariableManager get_vars()
11683 1726853246.85755: done with get_vars()
11683 1726853246.85789: in VariableManager get_vars()
11683 1726853246.85798: done with get_vars()
11683 1726853246.85820: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
11683 1726853246.85948: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
11683 1726853246.86026: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
11683 1726853246.86390: in VariableManager get_vars()
11683 1726853246.86402: done with get_vars()
11683 1726853246.86677: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
11683 1726853246.86758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
11683 1726853246.87758: in VariableManager get_vars()
11683 1726853246.87770: done with get_vars()
11683 1726853246.87778: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
11683 1726853246.87892: in VariableManager get_vars()
11683 1726853246.87904: done with get_vars()
11683 1726853246.87983: in VariableManager get_vars()
11683 1726853246.87994: done with get_vars()
11683 1726853246.88177: in VariableManager get_vars()
11683 1726853246.88188: done with get_vars()
11683 1726853246.88191: variable 'omit' from source: magic vars
11683 1726853246.88206: variable 'omit' from source: magic vars
11683 1726853246.88229: in VariableManager get_vars()
11683 1726853246.88237: done with get_vars()
11683 1726853246.88252: in VariableManager get_vars()
11683 1726853246.88261: done with get_vars()
11683 1726853246.88284: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
11683 1726853246.88343: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
11683 1726853246.89554: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
11683 1726853246.89777: in VariableManager get_vars()
11683 1726853246.89791: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
11683 1726853246.91056: in VariableManager get_vars()
11683 1726853246.91069: done with get_vars()
11683 1726853246.91076: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
11683 1726853246.91389: in VariableManager get_vars()
11683 1726853246.91401: done with get_vars()
11683 1726853246.91438: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
11683 1726853246.91450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
11683 1726853246.91605: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
11683 1726853246.91699: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
11683 1726853246.91701: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
11683 1726853246.91722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
11683 1726853246.91737: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
11683 1726853246.91839: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
11683 1726853246.91879: Loaded config def from plugin (callback/default)
11683 1726853246.91881: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11683 1726853246.92618: Loaded config def from plugin (callback/junit)
11683 1726853246.92620: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11683 1726853246.92656: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
11683 1726853246.92694: Loaded config def from plugin (callback/minimal)
11683 1726853246.92696: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11683 1726853246.92721: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11683 1726853246.92762: Loaded config def from plugin (callback/tree)
11683 1726853246.92764: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
11683 1726853246.92848: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
11683 1726853246.92849: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Qi7/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11683 1726853246.92873: in VariableManager get_vars()
11683 1726853246.92882: done with get_vars()
11683 1726853246.92885: in VariableManager get_vars()
11683 1726853246.92890: done with get_vars()
11683 1726853246.92893: variable 'omit' from source: magic vars
11683 1726853246.92914: in VariableManager get_vars()
11683 1726853246.92922: done with get_vars()
11683 1726853246.92934: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
11683 1726853246.93291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11683 1726853246.93339: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11683 1726853246.93364: getting the remaining hosts for this loop
11683 1726853246.93366: done getting the remaining hosts for this loop
11683 1726853246.93368: getting the next task for host managed_node3
11683 1726853246.93370: done getting next task for host managed_node3
11683 1726853246.93374: ^ task is: TASK: Gathering Facts
11683 1726853246.93375: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853246.93376: getting variables
11683 1726853246.93377: in VariableManager get_vars()
11683 1726853246.93383: Calling all_inventory to load vars for managed_node3
11683 1726853246.93385: Calling groups_inventory to load vars for managed_node3
11683 1726853246.93386: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853246.93394: Calling all_plugins_play to load vars for managed_node3
11683 1726853246.93403: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853246.93405: Calling groups_plugins_play to load vars for managed_node3
11683 1726853246.93427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853246.93462: done with get_vars()
11683 1726853246.93466: done getting variables
11683 1726853246.93510: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Friday 20 September 2024 13:27:26 -0400 (0:00:00.007) 0:00:00.007 ******
11683 1726853246.93525: entering _queue_task() for managed_node3/gather_facts
11683 1726853246.93526: Creating lock for gather_facts
11683 1726853246.93810: worker is 1 (out of 1 available)
11683 1726853246.93822: exiting _queue_task() for managed_node3/gather_facts
11683 1726853246.93834: done queuing things up, now waiting for results queue to drain
11683 1726853246.93837: waiting for pending results...
11683 1726853246.93970: running TaskExecutor() for managed_node3/TASK: Gathering Facts
11683 1726853246.94026: in run() - task 02083763-bbaf-c5b2-e075-0000000000cc
11683 1726853246.94038: variable 'ansible_search_path' from source: unknown
11683 1726853246.94068: calling self._execute()
11683 1726853246.94118: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853246.94121: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853246.94130: variable 'omit' from source: magic vars
11683 1726853246.94198: variable 'omit' from source: magic vars
11683 1726853246.94218: variable 'omit' from source: magic vars
11683 1726853246.94243: variable 'omit' from source: magic vars
11683 1726853246.94280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11683 1726853246.94307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11683 1726853246.94324: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11683 1726853246.94337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11683 1726853246.94346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11683 1726853246.94374: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11683 1726853246.94377: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853246.94379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853246.94446: Set connection var ansible_shell_executable to /bin/sh
11683 1726853246.94457: Set connection var ansible_timeout to 10
11683 1726853246.94463: Set connection var ansible_module_compression to ZIP_DEFLATED
11683 1726853246.94468: Set connection var ansible_pipelining to False
11683 1726853246.94472: Set connection var ansible_shell_type to sh
11683 1726853246.94475: Set connection var ansible_connection to ssh
11683 1726853246.94491: variable 'ansible_shell_executable' from source: unknown
11683 1726853246.94494: variable 'ansible_connection' from source: unknown
11683 1726853246.94496: variable 'ansible_module_compression' from source: unknown
11683 1726853246.94499: variable 'ansible_shell_type' from source: unknown
11683 1726853246.94501: variable 'ansible_shell_executable' from source: unknown
11683 1726853246.94503: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853246.94508: variable 'ansible_pipelining' from source: unknown
11683 1726853246.94510: variable 'ansible_timeout' from source: unknown
11683 1726853246.94514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853246.94647: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11683 1726853246.94654: variable 'omit' from source: magic vars
11683 1726853246.94659: starting attempt loop
11683 1726853246.94662: running the handler
11683 1726853246.94675: variable 'ansible_facts' from source: unknown
11683 1726853246.94690: _low_level_execute_command(): starting
11683 1726853246.94697: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11683 1726853246.95215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
11683 1726853246.95219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11683 1726853246.95222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
11683 1726853246.95224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11683 1726853246.95276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
11683 1726853246.95279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11683 1726853246.95354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11683 1726853246.97050: stdout chunk (state=3): >>>/root <<<
11683 1726853246.97155: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11683 1726853246.97178: stderr chunk (state=3): >>><<<
11683 1726853246.97182: stdout chunk (state=3): >>><<<
11683 1726853246.97200: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11683 1726853246.97210: _low_level_execute_command(): starting
11683 1726853246.97216: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584 `" && echo ansible-tmp-1726853246.9720013-11687-43992294107584="` echo /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584 `" ) && sleep 0'
11683 1726853246.97642: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
11683 1726853246.97648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
11683 1726853246.97651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
11683 1726853246.97653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11683 1726853246.97663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11683 1726853246.97707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
11683 1726853246.97714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
11683 1726853246.97775: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11683 1726853246.99741: stdout chunk (state=3): >>>ansible-tmp-1726853246.9720013-11687-43992294107584=/root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584 <<<
11683 1726853246.99849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11683 1726853246.99874: stderr chunk (state=3): >>><<<
11683 1726853246.99878: stdout chunk (state=3): >>><<<
11683 1726853246.99893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853246.9720013-11687-43992294107584=/root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11683 1726853246.99919: variable 'ansible_module_compression' from source: unknown
11683 1726853246.99967: ANSIBALLZ: Using generic lock for ansible.legacy.setup
11683 1726853246.99972: ANSIBALLZ: Acquiring lock
11683 1726853246.99975: ANSIBALLZ: Lock acquired: 139785061355968
11683 1726853246.99978: ANSIBALLZ: Creating module
11683 1726853247.21379: ANSIBALLZ: Writing module into payload
11683 1726853247.21383: ANSIBALLZ: Writing module
11683 1726853247.21385: ANSIBALLZ: Renaming module
11683 1726853247.21387: ANSIBALLZ: Done creating module
11683 1726853247.21416: variable 'ansible_facts' from source: unknown
11683 1726853247.21434: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11683 1726853247.21449: _low_level_execute_command(): starting
11683 1726853247.21458: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
11683 1726853247.22062: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11683 1726853247.22077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
11683 1726853247.22091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
11683 1726853247.22107: stderr chunk (state=3): >>>debug1: Reading configuration
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853247.22122: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853247.22133: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853247.22145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853247.22164: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853247.22185: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853247.22196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853247.22207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853247.22220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853247.22236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853247.22247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853247.22257: stderr chunk (state=3): >>>debug2: match found <<< 11683 1726853247.22270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853247.22343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853247.22368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853247.22468: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853247.24216: stdout chunk (state=3): >>>PLATFORM <<< 11683 1726853247.24491: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11683 1726853247.24559: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853247.24562: stdout chunk (state=3): >>><<< 11683 1726853247.24564: stderr chunk (state=3): >>><<< 11683 1726853247.24586: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853247.24604 [managed_node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11683 1726853247.24776: _low_level_execute_command(): starting 11683 1726853247.24780: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11683 1726853247.25113: Sending initial data 11683 1726853247.25116: Sent initial data (1181 bytes) 11683 1726853247.25404: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853247.25416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11683 1726853247.25433: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853247.25479: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853247.25486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853247.25561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853247.29192: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} 
<<< 11683 1726853247.29779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853247.29782: stdout chunk (state=3): >>><<< 11683 1726853247.29785: stderr chunk (state=3): >>><<< 11683 1726853247.29787: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 11683 1726853247.29789: variable 'ansible_facts' from source: unknown 11683 1726853247.29791: variable 'ansible_facts' from source: unknown 11683 1726853247.29793: variable 'ansible_module_compression' from source: unknown 11683 1726853247.29795: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11683 1726853247.29816: variable 'ansible_facts' from source: unknown 11683 1726853247.30240: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py 11683 1726853247.30712: Sending initial data 11683 1726853247.30716: Sent initial data (153 bytes) 11683 1726853247.31685: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853247.31904: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853247.31917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853247.32004: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853247.33651: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853247.33862: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853247.33912: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpdpovfqys /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py <<< 11683 1726853247.33926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py" <<< 11683 1726853247.33980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpdpovfqys" to remote "/root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py" <<< 11683 1726853247.36645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853247.36909: stderr chunk (state=3): >>><<< 11683 1726853247.36912: stdout chunk 
(state=3): >>><<< 11683 1726853247.36915: done transferring module to remote 11683 1726853247.36917: _low_level_execute_command(): starting 11683 1726853247.36919: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/ /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py && sleep 0' 11683 1726853247.38025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853247.38049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853247.38073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853247.38094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853247.38245: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853247.38396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853247.38480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853247.40355: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 11683 1726853247.40381: stderr chunk (state=3): >>><<< 11683 1726853247.40389: stdout chunk (state=3): >>><<< 11683 1726853247.40407: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853247.40419: _low_level_execute_command(): starting 11683 1726853247.40428: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/AnsiballZ_setup.py && sleep 0' 11683 1726853247.41126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853247.41141: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853247.41156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853247.41221: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853247.41263: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853247.41282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853247.41307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853247.41412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853247.43691: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11683 1726853247.43712: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11683 1726853247.43767: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11683 1726853247.43807: stdout chunk (state=3): >>>import 'posix' # <<< 11683 1726853247.43920: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 11683 1726853247.43949: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11683 1726853247.43977: stdout chunk (state=3): >>>import 'codecs' # <<< 11683 1726853247.44020: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11683 1726853247.44062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11683 1726853247.44066: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe02184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe01e7b30> <<< 11683 1726853247.44092: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe021aa50> <<< 11683 1726853247.44125: stdout chunk (state=3): >>>import '_signal' # <<< 11683 1726853247.44173: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11683 1726853247.44190: stdout chunk (state=3): >>>import 'io' # <<< 11683 1726853247.44201: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11683 1726853247.44287: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11683 1726853247.44315: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11683 1726853247.44352: stdout chunk (state=3): >>>import 'os' # <<< 11683 1726853247.44406: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages <<< 11683 1726853247.44450: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11683 1726853247.44453: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11683 1726853247.44470: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdffc9130> <<< 11683 1726853247.44528: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11683 1726853247.44539: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdffc9fa0> <<< 11683 1726853247.44563: stdout chunk (state=3): >>>import 'site' # <<< 11683 1726853247.44599: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11683 1726853247.44976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11683 1726853247.45021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11683 1726853247.45030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.45042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11683 1726853247.45110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11683 1726853247.45116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11683 1726853247.45226: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0007dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11683 1726853247.45254: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0007fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11683 1726853247.45281: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11683 1726853247.45335: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.45479: stdout chunk (state=3): >>>import 'itertools' # <<< 11683 1726853247.45509: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe003f800> <<< 11683 1726853247.45512: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe003fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe001faa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe001d1c0> <<< 11683 1726853247.45659: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0004f80> <<< 11683 1726853247.45696: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11683 1726853247.45885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11683 1726853247.45899: stdout chunk (state=3): >>>import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7fe005f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe005e300> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe001e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0006e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00947a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0004200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11683 1726853247.45935: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe0094c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0094b00> <<< 11683 1726853247.45986: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe0094ef0> <<< 11683 1726853247.46031: stdout chunk (state=3): 
>>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0002d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.46182: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11683 1726853247.46217: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00955b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0095280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00964b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11683 1726853247.46234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11683 1726853247.46270: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00ac680> <<< 11683 1726853247.46385: stdout chunk (state=3): >>>import 'errno' # <<< 11683 1726853247.46397: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe00add30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00aebd0> <<< 11683 1726853247.46496: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe00af230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00ae120> <<< 11683 1726853247.46546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe00afcb0> <<< 11683 1726853247.46550: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00af3e0> <<< 11683 1726853247.46592: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0096450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches 
/usr/lib64/python3.12/tempfile.py <<< 11683 1726853247.46678: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11683 1726853247.46686: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11683 1726853247.46748: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11683 1726853247.46752: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfda7b90> <<< 11683 1726853247.46785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd0650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd03b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd0680> <<< 11683 1726853247.46898: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11683 1726853247.46904: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853247.47023: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd0fb0> <<< 11683 1726853247.47159: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd1910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd0860> <<< 11683 1726853247.47197: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfda5d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11683 1726853247.47233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11683 1726853247.47334: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd2cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd17f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0096ba0> <<< 11683 1726853247.47337: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11683 1726853247.47410: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.47413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11683 1726853247.47479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11683 1726853247.47482: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdff020> <<< 11683 1726853247.47693: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11683 1726853247.47711: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.47725: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe23410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11683 1726853247.47746: stdout chunk (state=3): >>>import 'ntpath' # <<< 11683 1726853247.47793: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe801a0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11683 1726853247.47858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11683 1726853247.47861: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11683 1726853247.47953: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11683 1726853247.48047: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe82900> <<< 11683 1726853247.48160: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe802c0> <<< 11683 1726853247.48165: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe4d1c0> <<< 11683 1726853247.48208: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf72d2e0> <<< 11683 1726853247.48212: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe22210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd3bf0> <<< 11683 1726853247.48332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11683 1726853247.48362: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7fdfe22570> <<< 11683 1726853247.48735: stdout chunk (state=3): >>># zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload_xoeyy_38/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11683 1726853247.48784: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.48853: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11683 1726853247.48957: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11683 1726853247.48961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11683 1726853247.49003: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf78f050> import '_typing' # <<< 11683 1726853247.49159: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf76df40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf76d0a0> <<< 11683 1726853247.49178: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.49211: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11683 1726853247.49240: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.49262: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 11683 1726853247.49275: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.50794: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.51901: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf78cf20> <<< 11683 1726853247.52140: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf7be9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7be750> <<< 11683 1726853247.52206: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7be060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7be930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd0440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # 
extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf7bf6b0> <<< 11683 1726853247.52225: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf7bf830> <<< 11683 1726853247.52307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11683 1726853247.52353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 11683 1726853247.52410: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7bfd70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11683 1726853247.52590: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf629b20> <<< 11683 1726853247.52595: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf62b710> <<< 11683 1726853247.52621: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 
'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62c0e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11683 1726853247.52641: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62d250> <<< 11683 1726853247.52654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11683 1726853247.52694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11683 1726853247.52720: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11683 1726853247.52761: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62fd10> <<< 11683 1726853247.52831: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfe4fdd0> <<< 11683 1726853247.52849: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62e000> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11683 1726853247.52928: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11683 1726853247.52947: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11683 1726853247.53084: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11683 1726853247.53106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11683 1726853247.53109: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf637b90> import '_tokenize' # <<< 11683 1726853247.53192: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf636660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf6363c0> <<< 11683 1726853247.53289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11683 1726853247.53292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11683 1726853247.53294: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf636930> <<< 11683 1726853247.53327: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62e4e0> <<< 11683 1726853247.53355: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object 
at 0x7f7fdf67bd70> <<< 11683 1726853247.53481: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67bfe0> <<< 11683 1726853247.53505: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf67d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11683 1726853247.53526: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11683 1726853247.53582: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf67ff20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7fdf67e090> <<< 11683 1726853247.53611: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11683 1726853247.53651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.53675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 11683 1726853247.53718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11683 1726853247.53743: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf683740> <<< 11683 1726853247.53936: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf680110> <<< 11683 1726853247.53968: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf684500> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf684950> <<< 11683 1726853247.54053: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from 
'/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf684a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67c110> <<< 11683 1726853247.54081: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11683 1726853247.54104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11683 1726853247.54136: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853247.54408: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf510170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf5115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf686900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf687cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf686540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11683 1726853247.54425: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.54518: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.54605: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.54646: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 11683 1726853247.54669: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11683 1726853247.54792: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.54915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.55491: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.56148: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11683 1726853247.56168: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf5157c0> <<< 11683 1726853247.56235: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11683 1726853247.56258: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5165d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5117f0> <<< 11683 1726853247.56308: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11683 1726853247.56338: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.56368: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11683 1726853247.56512: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.56790: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf516360> <<< 11683 1726853247.56794: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.57158: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.57600: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.57687: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.57760: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11683 1726853247.57768: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.57793: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.57835: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11683 1726853247.57908: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 
1726853247.58005: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11683 1726853247.58028: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11683 1726853247.58072: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.58117: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11683 1726853247.58129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.58351: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.58583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11683 1726853247.58696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 11683 1726853247.58734: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf517830> <<< 11683 1726853247.58757: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.58814: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.58901: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11683 1726853247.59038: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 11683 1726853247.59058: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.59107: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.59162: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.59346: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11683 1726853247.59376: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf522030> <<< 11683 1726853247.59415: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf51f3b0> <<< 11683 1726853247.59448: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 11683 1726853247.59481: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.59598: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.59615: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.59691: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.59695: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11683 1726853247.59796: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11683 1726853247.59799: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11683 1726853247.59830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11683 1726853247.59841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11683 1726853247.59893: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf60aae0> <<< 11683 1726853247.59933: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf6fe7b0> <<< 11683 1726853247.60028: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf522210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf521df0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11683 1726853247.60039: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60194: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60197: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11683 1726853247.60204: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60261: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60322: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60355: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60367: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60494: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 11683 1726853247.60497: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.60520: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 11683 1726853247.60602: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60675: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60804: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.60807: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11683 1726853247.60929: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.61105: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.61160: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.61316: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b6510> <<< 11683 1726853247.61358: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11683 1726853247.61361: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11683 1726853247.61419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11683 1726853247.61441: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11683 1726853247.61459: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d4170> <<< 11683 1726853247.61521: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853247.61524: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf1d44a0> <<< 11683 1726853247.61553: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5a6c30> <<< 11683 1726853247.61596: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b6ff0> <<< 11683 1726853247.61731: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b4bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11683 1726853247.61758: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11683 1726853247.61789: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf1d7440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d6cf0> <<< 11683 1726853247.61816: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf1d6ed0> <<< 11683 1726853247.61853: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d6120> <<< 11683 1726853247.61857: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11683 1726853247.62110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d75c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf2360c0> <<< 11683 1726853247.62113: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf2340e0> <<< 11683 1726853247.62157: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b48c0> import 'ansible.module_utils.facts.timeout' # <<< 11683 1726853247.62199: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 11683 1726853247.62202: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 11683 1726853247.62274: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.62332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11683 1726853247.62347: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.62400: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.62515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 11683 1726853247.62535: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.62555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 11683 1726853247.62637: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 11683 1726853247.62669: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11683 1726853247.62755: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.62768: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 11683 1726853247.62846: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.63132: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.63136: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11683 1726853247.63508: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.63939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11683 1726853247.63960: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64083: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.64099: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64134: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 11683 1726853247.64146: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64176: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64201: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11683 1726853247.64215: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64268: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64413: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11683 1726853247.64416: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 
1726853247.64435: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11683 1726853247.64450: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 11683 1726853247.64578: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf237980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11683 1726853247.64869: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf236d80> import 'ansible.module_utils.facts.system.local' # <<< 11683 1726853247.64874: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64939: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.64999: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11683 1726853247.65019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65101: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65193: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11683 1726853247.65284: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65306: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11683 
1726853247.65373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65398: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65443: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11683 1726853247.65498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11683 1726853247.65567: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853247.65640: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf272390> <<< 11683 1726853247.65834: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf2621b0> import 'ansible.module_utils.facts.system.python' # <<< 11683 1726853247.65838: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65889: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.65951: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 11683 1726853247.66039: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.66117: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.66234: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.66384: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 11683 1726853247.66401: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 11683 1726853247.66426: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.66468: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 11683 1726853247.66519: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.66573: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11683 1726853247.66577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11683 1726853247.66600: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853247.66653: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf285d90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf2859d0> <<< 11683 1726853247.66678: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 11683 1726853247.66717: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.66764: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 11683 1726853247.66923: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11683 1726853247.67090: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67211: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67636: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11683 1726853247.67639: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.67694: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11683 1726853247.67835: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.67950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11683 1726853247.67968: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.68004: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.68026: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.68727: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.69202: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 11683 1726853247.69296: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.69335: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11683 1726853247.69437: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.69550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 11683 1726853247.69763: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.69888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11683 
1726853247.69957: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.69990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 11683 1726853247.70003: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70107: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70196: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70388: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70605: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11683 1726853247.70608: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70691: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70704: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11683 1726853247.70747: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 11683 1726853247.70782: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70827: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.70933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11683 1726853247.70937: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.70989: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11683 1726853247.71019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.71076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 11683 1726853247.71139: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.71200: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11683 1726853247.71468: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.71728: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11683 1726853247.71795: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.71863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11683 1726853247.71880: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.71903: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.71933: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 11683 1726853247.71973: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72182: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11683 1726853247.72201: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.72250: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11683 1726853247.72279: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 11683 1726853247.72289: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72329: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72364: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11683 1726853247.72392: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72419: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72429: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11683 1726853247.72468: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72527: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72587: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11683 1726853247.72680: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11683 1726853247.72740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.72845: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11683 1726853247.73005: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.73191: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 11683 1726853247.73241: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.73296: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 11683 1726853247.73698: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853247.73749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11683 1726853247.73762: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11683 1726853247.73831: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853247.74076: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11683 1726853247.74094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11683 1726853247.74152: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf0827b0> <<< 11683 1726853247.74155: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf082c60> <<< 11683 1726853247.74191: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0792e0> <<< 11683 1726853247.85650: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11683 1726853247.85693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf083f20> <<< 11683 1726853247.85723: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 11683 1726853247.85748: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0c8980> <<< 11683 1726853247.85803: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853247.85849: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0ca060> <<< 11683 1726853247.85886: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0c9af0> <<< 11683 1726853247.86136: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11683 1726853248.10246: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "27", "epoch": "1726853247", "epoch_int": "1726853247", "date": "2024-09-20", "time": "13:27:27", "iso8601_micro": "2024-09-20T17:27:27.747308Z", "iso8601": "2024-09-20T17:27:27Z", "iso8601_basic": "20240920T132727747308", "iso8601_basic_short": "20240920T132727", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super 
User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2988, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 543, "free": 2988}, "nocache": {"free": 3302, "used": 229}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", <<< 11683 1726853248.10257: stdout chunk (state=3): >>>"ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 392, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805977600, "block_size": 4096, "block_total": 65519099, "block_available": 63917475, "block_used": 1601624, "inode_total": 131070960, "inode_available": 131029154, "inode_used": 41806, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", 
"XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.5361328125, "5m": 0.30810546875, "15m": 0.14501953125}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_<<< 11683 1726853248.10297: stdout chunk (state=3): >>>mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", 
"highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off 
[fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off 
[fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11683 1726853248.10863: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11683 1726853248.10933: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # 
clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external<<< 11683 1726853248.10941: stdout chunk (state=3): >>> # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types <<< 11683 1726853248.10991: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # 
cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder <<< 11683 1726853248.11037: stdout chunk (state=3): >>># cleanup[2] removing json.encoder # cleanup[2] 
removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters <<< 11683 1726853248.11040: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] 
removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse <<< 11683 1726853248.11121: stdout chunk (state=3): >>># cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # 
cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly <<< 11683 1726853248.11156: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] 
removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # 
destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware <<< 11683 1726853248.11167: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11683 1726853248.11537: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11683 1726853248.11570: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11683 1726853248.11592: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy 
zipfile._path.glob # destroy ipaddress <<< 11683 1726853248.11669: stdout chunk (state=3): >>># destroy ntpath <<< 11683 1726853248.11679: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 11683 1726853248.11704: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 11683 1726853248.11756: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 11683 1726853248.11768: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11683 1726853248.11831: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 11683 1726853248.11872: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 11683 1726853248.11906: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess <<< 11683 1726853248.11961: stdout chunk (state=3): >>># destroy base64 # destroy _ssl <<< 11683 1726853248.12010: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 11683 1726853248.12014: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct <<< 11683 1726853248.12038: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy 
ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 11683 1726853248.12095: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 11683 1726853248.12155: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 11683 1726853248.12195: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 11683 1726853248.12219: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # 
cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig <<< 11683 1726853248.12269: stdout chunk (state=3): >>># cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11683 1726853248.12286: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11683 1726853248.12448: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11683 1726853248.12490: stdout chunk (state=3): >>># destroy _collections <<< 11683 1726853248.12512: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11683 1726853248.12532: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11683 1726853248.12570: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11683 1726853248.12605: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11683 1726853248.12619: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11683 1726853248.12722: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs <<< 11683 1726853248.12767: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 11683 1726853248.12780: stdout chunk (state=3): >>># destroy _hashlib <<< 11683 1726853248.12821: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 11683 1726853248.12841: stdout chunk (state=3): >>># clear sys.audit hooks <<< 11683 1726853248.13243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853248.13281: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853248.13284: stdout chunk (state=3): >>><<< 11683 1726853248.13286: stderr chunk (state=3): >>><<< 11683 1726853248.13585: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe02184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe01e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe021aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdffc9130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdffc9fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0007dd0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0007fe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe003f800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe003fe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe001faa0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe001d1c0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0004f80> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe005f6e0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe005e300> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe001e060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0006e70> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00947a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0004200> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe0094c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0094b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe0094ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0002d20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00955b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0095280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00964b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00ac680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe00add30> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00aebd0> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe00af230> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00ae120> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fe00afcb0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe00af3e0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0096450> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfda7b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd0650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd03b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd0680> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd0fb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfdd1910> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd0860> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfda5d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd2cc0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd17f0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fe0096ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdff020> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe23410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe801a0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe82900> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe802c0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe4d1c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf72d2e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfe22210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd3bf0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7fdfe22570> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_xoeyy_38/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf78f050> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf76df40> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf76d0a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf78cf20> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf7be9c0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7be750> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7be060> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7be930> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdfdd0440> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf7bf6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf7bf830> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf7bfd70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf629b20> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf62b710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62c0e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62d250> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62fd10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdfe4fdd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62e000> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf637b90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf636660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf6363c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf636930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf62e4e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf67bd70> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67bfe0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf67d9d0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67d7c0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf67ff20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67e090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf683740> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf680110> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf684500> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf684950> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf684a10> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf67c110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf510170> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf5115b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf686900> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf687cb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf686540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf5157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5165d0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5117f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf516360> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf517830> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf522030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf51f3b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf60aae0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf6fe7b0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf522210> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf521df0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b6510> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d4170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf1d44a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5a6c30> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b6ff0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b4bf0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b4860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf1d7440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d6cf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf1d6ed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d6120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf1d75c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf2360c0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf2340e0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf5b48c0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf237980> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf236d80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf272390> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf2621b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf285d90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf2859d0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7fdf0827b0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf082c60> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0792e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf083f20> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0c8980> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0ca060> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7fdf0c9af0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "27", "epoch": "1726853247", "epoch_int": "1726853247", "date": "2024-09-20", "time": "13:27:27", "iso8601_micro": "2024-09-20T17:27:27.747308Z", "iso8601": "2024-09-20T17:27:27Z", "iso8601_basic": "20240920T132727747308", "iso8601_basic_short": "20240920T132727", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", 
"ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2988, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 543, "free": 2988}, "nocache": {"free": 3302, "used": 229}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 392, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805977600, "block_size": 4096, "block_total": 65519099, "block_available": 63917475, "block_used": 1601624, "inode_total": 131070960, "inode_available": 131029154, "inode_used": 41806, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_is_chroot": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": 
"https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.5361328125, "5m": 0.30810546875, "15m": 0.14501953125}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", 
"tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, 
"ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib 
# cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing 
importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # 
cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy 
ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] 
removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # 
cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] 
removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings 
# cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy 
ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy 
ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy 
ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy 
importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # 
cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping 
_frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
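[Editor's note: the interpreter-discovery warning above can be avoided by pinning the interpreter per host, so a later Python installation cannot change what the discovered path means. A minimal sketch of an inventory fragment; the host name and path here are illustrative, not taken from this run's actual inventory file:]

```yaml
# Illustrative inventory fragment: pinning ansible_python_interpreter
# disables discovery for this host and silences the [WARNING] above.
all:
  hosts:
    managed_node3:
      ansible_python_interpreter: /usr/bin/python3.12
```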
11683 1726853248.15635: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853248.15672: _low_level_execute_command(): starting 11683 1726853248.15689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853246.9720013-11687-43992294107584/ > /dev/null 2>&1 && sleep 0' 11683 1726853248.16464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853248.16481: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853248.16497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853248.16516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853248.16603: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853248.18558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853248.18692: stdout chunk (state=3): >>><<< 11683 1726853248.18702: stderr chunk (state=3): >>><<< 11683 1726853248.18764: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853248.18840: handler run complete 11683 1726853248.19004: variable 'ansible_facts' from source: unknown 11683 
1726853248.19155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.19451: variable 'ansible_facts' from source: unknown 11683 1726853248.19539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.19675: attempt loop complete, returning result 11683 1726853248.19686: _execute() done 11683 1726853248.19697: dumping result to json 11683 1726853248.19730: done dumping result, returning 11683 1726853248.19802: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-c5b2-e075-0000000000cc] 11683 1726853248.19806: sending task result for task 02083763-bbaf-c5b2-e075-0000000000cc 11683 1726853248.20677: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000cc 11683 1726853248.20680: WORKER PROCESS EXITING ok: [managed_node3] 11683 1726853248.21123: no more pending results, returning what we have 11683 1726853248.21126: results queue empty 11683 1726853248.21127: checking for any_errors_fatal 11683 1726853248.21128: done checking for any_errors_fatal 11683 1726853248.21129: checking for max_fail_percentage 11683 1726853248.21131: done checking for max_fail_percentage 11683 1726853248.21132: checking to see if all hosts have failed and the running result is not ok 11683 1726853248.21133: done checking to see if all hosts have failed 11683 1726853248.21134: getting the remaining hosts for this loop 11683 1726853248.21136: done getting the remaining hosts for this loop 11683 1726853248.21140: getting the next task for host managed_node3 11683 1726853248.21149: done getting next task for host managed_node3 11683 1726853248.21151: ^ task is: TASK: meta (flush_handlers) 11683 1726853248.21153: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853248.21157: getting variables 11683 1726853248.21159: in VariableManager get_vars() 11683 1726853248.21388: Calling all_inventory to load vars for managed_node3 11683 1726853248.21391: Calling groups_inventory to load vars for managed_node3 11683 1726853248.21394: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853248.21405: Calling all_plugins_play to load vars for managed_node3 11683 1726853248.21408: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853248.21411: Calling groups_plugins_play to load vars for managed_node3 11683 1726853248.21787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.21951: done with get_vars() 11683 1726853248.21962: done getting variables 11683 1726853248.22228: in VariableManager get_vars() 11683 1726853248.22237: Calling all_inventory to load vars for managed_node3 11683 1726853248.22239: Calling groups_inventory to load vars for managed_node3 11683 1726853248.22241: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853248.22248: Calling all_plugins_play to load vars for managed_node3 11683 1726853248.22250: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853248.22253: Calling groups_plugins_play to load vars for managed_node3 11683 1726853248.22580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.22972: done with get_vars() 11683 1726853248.22988: done queuing things up, now waiting for results queue to drain 11683 1726853248.22991: results queue empty 11683 1726853248.22992: checking for any_errors_fatal 11683 1726853248.22994: done checking for any_errors_fatal 11683 1726853248.22995: checking for max_fail_percentage 11683 1726853248.22996: done checking for 
max_fail_percentage 11683 1726853248.22997: checking to see if all hosts have failed and the running result is not ok 11683 1726853248.22998: done checking to see if all hosts have failed 11683 1726853248.23003: getting the remaining hosts for this loop 11683 1726853248.23004: done getting the remaining hosts for this loop 11683 1726853248.23007: getting the next task for host managed_node3 11683 1726853248.23012: done getting next task for host managed_node3 11683 1726853248.23015: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11683 1726853248.23016: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853248.23018: getting variables 11683 1726853248.23019: in VariableManager get_vars() 11683 1726853248.23029: Calling all_inventory to load vars for managed_node3 11683 1726853248.23031: Calling groups_inventory to load vars for managed_node3 11683 1726853248.23034: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853248.23038: Calling all_plugins_play to load vars for managed_node3 11683 1726853248.23040: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853248.23043: Calling groups_plugins_play to load vars for managed_node3 11683 1726853248.23397: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.23778: done with get_vars() 11683 1726853248.23786: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Friday 20 September 2024 13:27:28 -0400 (0:00:01.303) 0:00:01.310 ****** 11683 1726853248.23859: entering _queue_task() for 
managed_node3/include_tasks 11683 1726853248.23861: Creating lock for include_tasks 11683 1726853248.24616: worker is 1 (out of 1 available) 11683 1726853248.24628: exiting _queue_task() for managed_node3/include_tasks 11683 1726853248.24639: done queuing things up, now waiting for results queue to drain 11683 1726853248.24641: waiting for pending results... 11683 1726853248.24861: running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' 11683 1726853248.25039: in run() - task 02083763-bbaf-c5b2-e075-000000000006 11683 1726853248.25053: variable 'ansible_search_path' from source: unknown 11683 1726853248.25277: calling self._execute() 11683 1726853248.25281: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853248.25283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853248.25286: variable 'omit' from source: magic vars 11683 1726853248.25569: _execute() done 11683 1726853248.25582: dumping result to json 11683 1726853248.25590: done dumping result, returning 11683 1726853248.25601: done running TaskExecutor() for managed_node3/TASK: Include the task 'el_repo_setup.yml' [02083763-bbaf-c5b2-e075-000000000006] 11683 1726853248.25611: sending task result for task 02083763-bbaf-c5b2-e075-000000000006 11683 1726853248.25846: done sending task result for task 02083763-bbaf-c5b2-e075-000000000006 11683 1726853248.25850: WORKER PROCESS EXITING 11683 1726853248.25900: no more pending results, returning what we have 11683 1726853248.25906: in VariableManager get_vars() 11683 1726853248.25937: Calling all_inventory to load vars for managed_node3 11683 1726853248.25939: Calling groups_inventory to load vars for managed_node3 11683 1726853248.25942: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853248.25958: Calling all_plugins_play to load vars for managed_node3 11683 1726853248.25961: Calling groups_plugins_inventory to load vars for managed_node3 11683 
1726853248.25963: Calling groups_plugins_play to load vars for managed_node3 11683 1726853248.26324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.26718: done with get_vars() 11683 1726853248.26727: variable 'ansible_search_path' from source: unknown 11683 1726853248.26741: we have included files to process 11683 1726853248.26742: generating all_blocks data 11683 1726853248.26743: done generating all_blocks data 11683 1726853248.26747: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11683 1726853248.26748: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11683 1726853248.26752: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11683 1726853248.27389: in VariableManager get_vars() 11683 1726853248.27402: done with get_vars() 11683 1726853248.27410: done processing included file 11683 1726853248.27411: iterating over new_blocks loaded from include file 11683 1726853248.27412: in VariableManager get_vars() 11683 1726853248.27418: done with get_vars() 11683 1726853248.27419: filtering new block on tags 11683 1726853248.27428: done filtering new block on tags 11683 1726853248.27430: in VariableManager get_vars() 11683 1726853248.27436: done with get_vars() 11683 1726853248.27437: filtering new block on tags 11683 1726853248.27448: done filtering new block on tags 11683 1726853248.27450: in VariableManager get_vars() 11683 1726853248.27457: done with get_vars() 11683 1726853248.27457: filtering new block on tags 11683 1726853248.27465: done filtering new block on tags 11683 1726853248.27466: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for 
managed_node3 11683 1726853248.27472: extending task lists for all hosts with included blocks 11683 1726853248.27499: done extending task lists 11683 1726853248.27500: done processing included files 11683 1726853248.27500: results queue empty 11683 1726853248.27501: checking for any_errors_fatal 11683 1726853248.27502: done checking for any_errors_fatal 11683 1726853248.27502: checking for max_fail_percentage 11683 1726853248.27503: done checking for max_fail_percentage 11683 1726853248.27504: checking to see if all hosts have failed and the running result is not ok 11683 1726853248.27504: done checking to see if all hosts have failed 11683 1726853248.27505: getting the remaining hosts for this loop 11683 1726853248.27506: done getting the remaining hosts for this loop 11683 1726853248.27508: getting the next task for host managed_node3 11683 1726853248.27511: done getting next task for host managed_node3 11683 1726853248.27518: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11683 1726853248.27520: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853248.27522: getting variables 11683 1726853248.27523: in VariableManager get_vars() 11683 1726853248.27529: Calling all_inventory to load vars for managed_node3 11683 1726853248.27531: Calling groups_inventory to load vars for managed_node3 11683 1726853248.27532: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853248.27536: Calling all_plugins_play to load vars for managed_node3 11683 1726853248.27537: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853248.27539: Calling groups_plugins_play to load vars for managed_node3 11683 1726853248.27635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853248.27740: done with get_vars() 11683 1726853248.27748: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 13:27:28 -0400 (0:00:00.039) 0:00:01.349 ****** 11683 1726853248.27792: entering _queue_task() for managed_node3/setup 11683 1726853248.28014: worker is 1 (out of 1 available) 11683 1726853248.28026: exiting _queue_task() for managed_node3/setup 11683 1726853248.28036: done queuing things up, now waiting for results queue to drain 11683 1726853248.28038: waiting for pending results... 
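[Editor's note: the "Evaluated conditional" line in the trace below applies Ansible's Jinja2 `intersect` filter to decide whether the minimal fact subset still needs gathering. The same gate can be sketched in plain Python; the function and variable names are illustrative, not Ansible internals:]

```python
# Sketch of the conditional from the trace:
#   not ansible_facts.keys() | list | intersect(network_test_required_facts)
#       == network_test_required_facts
# i.e. gather facts only when some required fact key is still missing.
def needs_gathering(ansible_facts: dict, required: list) -> bool:
    # Keep only the required keys that are already present, preserving
    # the order of `required` (approximating the intersect filter).
    present = [key for key in required if key in ansible_facts]
    return present != required

# Fresh host: no facts cached yet, so the setup task must run.
print(needs_gathering({}, ["distribution", "os_family"]))  # True

# All required facts already gathered: the conditional gates setup off.
print(needs_gathering(
    {"distribution": "Fedora", "os_family": "RedHat"},
    ["distribution", "os_family"],
))  # False
```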
11683 1726853248.28185: running TaskExecutor() for managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 11683 1726853248.28249: in run() - task 02083763-bbaf-c5b2-e075-0000000000dd 11683 1726853248.28260: variable 'ansible_search_path' from source: unknown 11683 1726853248.28263: variable 'ansible_search_path' from source: unknown 11683 1726853248.28291: calling self._execute() 11683 1726853248.28339: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853248.28343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853248.28352: variable 'omit' from source: magic vars 11683 1726853248.28977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853248.31049: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853248.31118: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853248.31157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853248.31208: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853248.31237: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853248.31315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853248.31348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853248.31380: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853248.31424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853248.31442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853248.31608: variable 'ansible_facts' from source: unknown 11683 1726853248.31677: variable 'network_test_required_facts' from source: task vars 11683 1726853248.31716: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11683 1726853248.31726: variable 'omit' from source: magic vars 11683 1726853248.31765: variable 'omit' from source: magic vars 11683 1726853248.31804: variable 'omit' from source: magic vars 11683 1726853248.31834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853248.31863: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853248.31887: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853248.31909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853248.31923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853248.31955: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853248.31963: variable 'ansible_host' from source: host vars for 
'managed_node3' 11683 1726853248.31970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853248.32065: Set connection var ansible_shell_executable to /bin/sh 11683 1726853248.32083: Set connection var ansible_timeout to 10 11683 1726853248.32096: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853248.32105: Set connection var ansible_pipelining to False 11683 1726853248.32112: Set connection var ansible_shell_type to sh 11683 1726853248.32118: Set connection var ansible_connection to ssh 11683 1726853248.32143: variable 'ansible_shell_executable' from source: unknown 11683 1726853248.32151: variable 'ansible_connection' from source: unknown 11683 1726853248.32157: variable 'ansible_module_compression' from source: unknown 11683 1726853248.32276: variable 'ansible_shell_type' from source: unknown 11683 1726853248.32280: variable 'ansible_shell_executable' from source: unknown 11683 1726853248.32282: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853248.32285: variable 'ansible_pipelining' from source: unknown 11683 1726853248.32287: variable 'ansible_timeout' from source: unknown 11683 1726853248.32289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853248.32326: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853248.32341: variable 'omit' from source: magic vars 11683 1726853248.32350: starting attempt loop 11683 1726853248.32356: running the handler 11683 1726853248.32374: _low_level_execute_command(): starting 11683 1726853248.32385: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853248.33061: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 
1726853248.33085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853248.33099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853248.33116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853248.33132: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853248.33223: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853248.33247: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853248.33335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853248.34962: stdout chunk (state=3): >>>/root <<< 11683 1726853248.35141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853248.35153: stdout chunk (state=3): >>><<< 11683 1726853248.35166: stderr chunk (state=3): >>><<< 11683 1726853248.35487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853248.35498: _low_level_execute_command(): starting 11683 1726853248.35501: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249 `" && echo ansible-tmp-1726853248.3539767-11741-268282284571249="` echo /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249 `" ) && sleep 0' 11683 1726853248.36598: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853248.36799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853248.36838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853248.36906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853248.38891: stdout chunk (state=3): >>>ansible-tmp-1726853248.3539767-11741-268282284571249=/root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249 <<< 11683 1726853248.39032: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853248.39041: stdout chunk (state=3): >>><<< 11683 1726853248.39052: stderr chunk (state=3): >>><<< 11683 1726853248.39076: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853248.3539767-11741-268282284571249=/root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853248.39376: variable 'ansible_module_compression' from source: unknown 11683 1726853248.39379: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11683 1726853248.39382: variable 'ansible_facts' from source: unknown 11683 1726853248.39776: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py 11683 1726853248.40147: Sending initial data 11683 1726853248.40156: Sent initial data (154 bytes) 11683 1726853248.41787: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853248.43414: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853248.43479: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853248.43590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py" <<< 11683 1726853248.43600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp7gh_2xut /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py <<< 11683 1726853248.43613: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp7gh_2xut" to remote "/root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py" <<< 11683 1726853248.47291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853248.47304: stdout chunk (state=3): >>><<< 11683 1726853248.47318: stderr chunk (state=3): >>><<< 11683 1726853248.47341: done transferring module to remote 11683 1726853248.47473: _low_level_execute_command(): starting 11683 1726853248.47747: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/ /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py && sleep 0' 11683 1726853248.48669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853248.48977: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853248.49044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853248.49367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853248.51290: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853248.51598: stderr chunk (state=3): >>><<< 11683 1726853248.51602: stdout chunk (state=3): >>><<< 11683 1726853248.51692: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853248.51695: _low_level_execute_command(): starting 11683 1726853248.51697: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/AnsiballZ_setup.py && sleep 0' 11683 1726853248.52992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853248.53189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853248.53337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853248.56407: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11683 1726853248.56440: stdout chunk (state=3): >>>import _imp # builtin <<< 11683 
1726853248.56469: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 11683 1726853248.56664: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 11683 1726853248.56701: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853248.56715: stdout chunk (state=3): >>>import '_codecs' # <<< 11683 1726853248.56741: stdout chunk (state=3): >>>import 'codecs' # <<< 11683 1726853248.56773: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11683 1726853248.56793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11683 1726853248.56815: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40238bb00> <<< 11683 1726853248.56905: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023bea50> import '_signal' # <<< 11683 1726853248.56934: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11683 1726853248.56937: stdout chunk (state=3): >>>import 'io' # <<< 11683 1726853248.57147: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 
'posixpath' # <<< 11683 1726853248.57315: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023cdfa0> <<< 11683 1726853248.57328: stdout chunk (state=3): >>>import 'site' # <<< 11683 1726853248.57366: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11683 1726853248.57734: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11683 1726853248.57802: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853248.57809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11683 1726853248.57843: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11683 1726853248.57895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11683 1726853248.57907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021ebe90> <<< 11683 1726853248.57921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11683 1726853248.57986: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021ebf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11683 1726853248.58014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11683 1726853248.58030: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11683 1726853248.58085: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853248.58133: stdout chunk (state=3): >>>import 'itertools' # <<< 11683 1726853248.58140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402223890> <<< 11683 1726853248.58186: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402223f20> <<< 11683 1726853248.58293: stdout chunk (state=3): >>>import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402203b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402201280> <<< 11683 1726853248.58377: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021e9040> <<< 11683 1726853248.58449: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11683 1726853248.58472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11683 1726853248.58497: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11683 1726853248.58500: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11683 1726853248.58577: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402243800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402242420> <<< 11683 1726853248.58580: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 11683 1726853248.58583: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402202150> <<< 11683 1726853248.58839: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402240b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402278860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021e82c0> <<< 11683 1726853248.58843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402278d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402278bc0> # extension module 'binascii' loaded from 
'/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402278f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021e6de0> <<< 11683 1726853248.58861: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11683 1726853248.58880: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402279610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4022792e0> <<< 11683 1726853248.58954: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11683 1726853248.59034: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40227a510> import 'importlib.util' # import 'runpy' # <<< 11683 1726853248.59215: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe402290710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402291df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402292c90> <<< 11683 1726853248.59323: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4022932f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4022921e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402293d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4022934a0> <<< 11683 1726853248.59375: stdout chunk (state=3): >>>import 'shutil' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe40227a540> <<< 11683 1726853248.59483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11683 1726853248.59545: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401f8bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11683 1726853248.59660: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb4590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11683 1726853248.59752: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853248.60015: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb5010> <<< 11683 1726853248.60123: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb59d0> <<< 11683 1726853248.60195: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb48c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401f89df0> <<< 11683 1726853248.60328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11683 1726853248.60332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11683 1726853248.60334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb6de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb5b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40227ac30> <<< 11683 1726853248.60536: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11683 1726853248.60540: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fe3140> <<< 11683 1726853248.60887: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4020034d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 11683 1726853248.60945: stdout chunk (state=3): >>> import 'ntpath' # <<< 11683 1726853248.60988: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 11683 1726853248.61012: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402064200><<< 11683 1726853248.61016: stdout chunk (state=3): >>> <<< 11683 1726853248.61051: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11683 
1726853248.61110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 11683 1726853248.61122: stdout chunk (state=3): >>> <<< 11683 1726853248.61142: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py<<< 11683 1726853248.61190: stdout chunk (state=3): >>> <<< 11683 1726853248.61286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11683 1726853248.61364: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402066960> <<< 11683 1726853248.61501: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402064320><<< 11683 1726853248.61562: stdout chunk (state=3): >>> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402029250> <<< 11683 1726853248.61599: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py<<< 11683 1726853248.61624: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019252e0><<< 11683 1726853248.61686: stdout chunk (state=3): >>> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4020022d0><<< 11683 1726853248.61990: stdout chunk (state=3): >>> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb7d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11683 1726853248.62015: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe402002630><<< 11683 1726853248.62027: stdout chunk (state=3): >>> <<< 11683 1726853248.62404: 
stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_twucg1i2/ansible_setup_payload.zip' <<< 11683 1726853248.62418: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.62639: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.62731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11683 1726853248.62793: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 11683 1726853248.62797: stdout chunk (state=3): >>> <<< 11683 1726853248.62981: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40198ef90> import '_typing' # <<< 11683 1726853248.63092: stdout chunk (state=3): >>> <<< 11683 1726853248.63284: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40196de80> <<< 11683 1726853248.63304: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40196cfe0> <<< 11683 1726853248.63321: stdout chunk (state=3): >>># zipimport: zlib available<<< 11683 1726853248.63385: stdout chunk (state=3): >>> import 'ansible' # <<< 11683 1726853248.63403: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.63435: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.63492: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 11683 1726853248.63593: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.65890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.66885: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 11683 1726853248.66890: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40198ce30> <<< 11683 1726853248.67135: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4019be960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019be6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019be000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11683 1726853248.67139: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe4019be4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40198f9b0> <<< 11683 1726853248.67143: stdout chunk (state=3): >>>import 'atexit' # <<< 11683 1726853248.67166: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4019bf6e0> <<< 11683 1726853248.67200: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853248.67341: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4019bf920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019bfe30> import 'pwd' # <<< 11683 1726853248.67445: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11683 1726853248.67467: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401829ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40182b8c0> <<< 11683 1726853248.67476: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11683 1726853248.67550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11683 1726853248.67749: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182c2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182fef0> <<< 11683 1726853248.67770: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4021e6ed0> <<< 11683 1726853248.67820: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11683 1726853248.67863: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11683 1726853248.67878: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11683 1726853248.68091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401837f20> import '_tokenize' # <<< 11683 1726853248.68119: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4018369f0> <<< 11683 1726853248.68124: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401836750> <<< 11683 1726853248.68147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11683 1726853248.68291: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401836cc0> <<< 11683 1726853248.68612: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40187c1a0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187c350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40187ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187db80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4018802c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187e450> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11683 1726853248.68615: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853248.68654: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11683 1726853248.68736: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401883aa0> <<< 11683 1726853248.68953: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401880470> <<< 11683 1726853248.69025: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401884830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401884a70> <<< 11683 1726853248.69092: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4018843e0> <<< 11683 1726853248.69167: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187c500> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11683 1726853248.69191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11683 1726853248.69241: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4017104d0> <<< 11683 1726853248.69487: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853248.69596: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401711a90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401886c30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401887fb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401886840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11683 1726853248.69695: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11683 1726853248.69808: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.69855: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.69858: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.69873: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available <<< 11683 1726853248.69934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11683 1726853248.70111: stdout chunk (state=3): >>># zipimport: zlib available<<< 11683 1726853248.70191: stdout chunk (state=3): >>> <<< 11683 1726853248.70335: stdout chunk (state=3): >>># zipimport: zlib available<<< 11683 1726853248.70338: stdout chunk (state=3): >>> <<< 11683 1726853248.71246: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.72310: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401715b80> <<< 11683 1726853248.72529: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe4017169f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401711a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.72532: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401716570> # zipimport: zlib available <<< 11683 1726853248.73049: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.73514: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.73655: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.73791: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11683 1726853248.73794: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11683 1726853248.73824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.73900: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11683 1726853248.73913: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.74110: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11683 1726853248.74263: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.74597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code 
object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 11683 1726853248.74660: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401717a10> # zipimport: zlib available <<< 11683 1726853248.74730: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.74808: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 11683 1726853248.74822: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11683 1726853248.74887: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.74890: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.74944: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11683 1726853248.75045: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.75085: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.75353: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4017222d0> <<< 11683 1726853248.75416: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40171d1c0> <<< 11683 1726853248.75432: stdout 
chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 11683 1726853248.75534: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.75537: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.75654: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.75711: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 11683 1726853248.75723: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853248.75742: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11683 1726853248.75888: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11683 1726853248.75914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11683 1726853248.76004: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40180aa80> <<< 11683 1726853248.76117: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019ea750> <<< 11683 1726853248.76178: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4018854f0> <<< 11683 1726853248.76190: stdout chunk (state=3): 
>>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4018366c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11683 1726853248.76234: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.76264: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11683 1726853248.76383: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available <<< 11683 1726853248.76451: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11683 1726853248.76477: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.76567: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.76660: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.76742: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.76774: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.76879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 11683 1726853248.76961: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.77065: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.77098: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.77288: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11683 1726853248.77442: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.77711: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.77767: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.77928: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11683 1726853248.77954: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b2840> <<< 11683 1726853248.77983: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11683 1726853248.78000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11683 1726853248.78057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11683 1726853248.78085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11683 1726853248.78114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11683 1726853248.78141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013b8170> <<< 11683 1726853248.78193: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4013b84d0> <<< 11683 1726853248.78299: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017a7410> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b3380> <<< 11683 1726853248.78600: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b0f20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b0a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4013bb440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013bacf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4013baed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013ba120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11683 1726853248.78858: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013bb620> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40141a150> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401418170> <<< 11683 1726853248.78861: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b0c20> import 'ansible.module_utils.facts.timeout' # <<< 11683 1726853248.78864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11683 1726853248.78892: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.78899: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 11683 1726853248.78927: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79099: stdout chunk 
(state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 11683 1726853248.79102: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79158: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11683 1726853248.79178: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11683 1726853248.79195: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79460: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.79464: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11683 1726853248.79466: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79523: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79636: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79639: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.79739: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11683 1726853248.80284: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.80948: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.81080: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.81113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 11683 1726853248.81186: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.81302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.81305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 11683 1726853248.81485: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.81547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11683 1726853248.81591: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40141a300> <<< 11683 1726853248.81637: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11683 1726853248.81668: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11683 1726853248.81852: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40141aed0> <<< 11683 1726853248.81939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.82030: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11683 1726853248.82073: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.82395: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' 
# # zipimport: zlib available <<< 11683 1726853248.82477: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.82585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.82647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11683 1726853248.82764: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11683 1726853248.82831: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853248.82937: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4014564b0> <<< 11683 1726853248.83321: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4014462d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.83377: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11683 1726853248.83390: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.83514: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.83637: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.83999: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.84076: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 11683 1726853248.84079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # <<< 11683 1726853248.84090: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.84134: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.84167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11683 1726853248.84247: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.84299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11683 1726853248.84337: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853248.84587: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40146a2d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401469ee0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 11683 1726853248.84791: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.85015: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11683 1726853248.85170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.85317: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.85373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.85430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 11683 1726853248.85463: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11683 1726853248.85477: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.85575: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.85708: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.85925: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11683 1726853248.85938: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.86142: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.86305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11683 1726853248.86309: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.86357: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.86400: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.87309: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.88106: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 11683 1726853248.88184: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # <<< 11683 1726853248.88511: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11683 1726853248.88585: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.88729: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11683 1726853248.88830: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.88978: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.89218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11683 1726853248.89242: 
stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.89259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 11683 1726853248.89385: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 11683 1726853248.89388: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.89534: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.89677: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.89969: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.90283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 11683 1726853248.90349: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 11683 1726853248.90396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 11683 1726853248.90469: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.90475: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 11683 1726853248.90790: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 11683 1726853248.90793: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 11683 1726853248.90893: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.90918: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 11683 1726853248.90986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.91060: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.network.hurd' # <<< 11683 1726853248.91087: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.91493: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.91911: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11683 1726853248.91996: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.92102: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available <<< 11683 1726853248.92219: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.92390: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11683 1726853248.92470: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.92592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available <<< 11683 1726853248.92728: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 11683 1726853248.92766: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.92790: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.92857: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.93016: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.93115: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11683 1726853248.93129: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11683 1726853248.93212: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.93288: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 11683 1726853248.93291: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.93617: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.93934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available <<< 11683 1726853248.94003: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.94062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11683 1726853248.94136: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.94392: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853248.94473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 11683 1726853248.94477: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 11683 1726853248.94480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.94779: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.94794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11683 1726853248.94819: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853248.95625: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11683 1726853248.95641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11683 1726853248.95675: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40126f980> <<< 11683 1726853248.95698: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40126e060> <<< 11683 1726853248.95770: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40126d0d0> <<< 11683 1726853248.97057: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "28", "epoch": "1726853248", "epoch_int": "1726853248", "date": "2024-09-20", "time": "13:27:28", "iso8601_micro": "2024-09-20T17:27:28.949667Z", "iso8601": "2024-09-20T17:27:28Z", "iso8601_basic": 
"20240920T132728949667", "iso8601_basic_short": "20240920T132728", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", 
"XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", 
"ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11683 1726853248.97872: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] 
removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy <<< 11683 1726853248.97884: stdout chunk (state=3): >>># destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # 
cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat <<< 11683 1726853248.97987: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # 
cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters<<< 11683 1726853248.98036: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing 
ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing 
ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing 
ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing 
ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env <<< 11683 1726853248.98079: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn <<< 11683 1726853248.98104: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd<<< 11683 1726853248.98114: stdout chunk (state=3): >>> # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11683 1726853248.98907: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # 
destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 11683 1726853248.98984: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 11683 1726853248.99019: stdout chunk (state=3): >>># destroy _pickle <<< 11683 1726853248.99062: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 11683 1726853248.99276: stdout chunk (state=3): >>># destroy shlex<<< 11683 1726853248.99280: stdout chunk (state=3): >>> # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 11683 1726853248.99337: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy 
ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 11683 1726853248.99362: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 11683 1726853248.99393: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 11683 1726853248.99419: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11683 1726853248.99570: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 11683 1726853248.99596: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] 
wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11683 1726853248.99767: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11683 1726853248.99796: stdout chunk (state=3): >>># destroy _collections <<< 11683 1726853248.99827: stdout chunk (state=3): >>># destroy platform <<< 11683 1726853248.99842: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11683 1726853248.99929: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11683 1726853248.99990: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 11683 1726853249.00099: 
stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11683 1726853249.00424: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 11683 1726853249.00427: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 11683 1726853249.00429: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11683 1726853249.00806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853249.00847: stderr chunk (state=3): >>><<< 11683 1726853249.00857: stdout chunk (state=3): >>><<< 11683 1726853249.01388: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40238bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4023cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021ebe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021ebf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402223890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402223f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402203b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402201280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021e9040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402243800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402242420> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402202150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402240b60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402278860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021e82c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402278d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402278bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402278f80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4021e6de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402279610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4022792e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40227a510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402290710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402291df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402292c90> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4022932f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4022921e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe402293d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4022934a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40227a540> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401f8bc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb4590> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb5010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401fb59d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb48c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401f89df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb6de0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb5b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40227ac30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fe3140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4020034d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402064200> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402066960> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402064320> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe402029250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019252e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4020022d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401fb7d40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe402002630> # zipimport: found 103 names in '/tmp/ansible_setup_payload_twucg1i2/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe40198ef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40196de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40196cfe0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40198ce30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4019be960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019be6f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019be000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019be4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40198f9b0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4019bf6e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4019bf920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019bfe30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401829ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40182b8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe40182c2c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182d460> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182fef0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4021e6ed0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182e1b0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7fe401837f20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4018369f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401836750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401836cc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40182e6c0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40187c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187c350> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40187ddc0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187db80> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4018802c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187e450> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401883aa0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401880470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401884830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401884a70> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4018843e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40187c500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4017104d0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401711a90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401886c30> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401887fb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401886840> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe401715b80> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017169f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401711a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401716570> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401717a10> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4017222d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40171d1c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40180aa80> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4019ea750> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4018854f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4018366c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b2840> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013b8170> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4013b84d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017a7410> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b3380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b0f20> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b0a40> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4013bb440> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013bacf0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4013baed0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013ba120> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4013bb620> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40141a150> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401418170> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4017b0c20> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40141a300> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40141aed0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe4014564b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe4014462d0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40146a2d0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe401469ee0> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe40126f980> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40126e060> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe40126d0d0> {"ansible_facts": {"ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, 
"ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "28", "epoch": "1726853248", "epoch_int": "1726853248", "date": "2024-09-20", "time": "13:27:28", "iso8601_micro": "2024-09-20T17:27:28.949667Z", "iso8601": "2024-09-20T17:27:28Z", "iso8601_basic": "20240920T132728949667", "iso8601_basic_short": "20240920T132728", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", 
"ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_local": {}, "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", 
"ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing 
encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] 
removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing 
ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing 
ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy 
ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy 
_bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping 
ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # 
cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
[WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser 
# cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing 
json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] 
removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # 
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # 
destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] 
wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11683 1726853249.03072: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853249.03077: _low_level_execute_command(): starting 11683 1726853249.03085: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853248.3539767-11741-268282284571249/ > /dev/null 2>&1 && sleep 0' 11683 1726853249.03154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853249.03157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853249.03160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.03162: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853249.03164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853249.03166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.03328: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853249.03332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853249.03334: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853249.03507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853249.06157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853249.06452: stderr chunk (state=3): >>><<< 11683 1726853249.06455: stdout 
chunk (state=3): >>><<< 11683 1726853249.06458: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853249.06460: handler run complete 11683 1726853249.06462: variable 'ansible_facts' from source: unknown 11683 1726853249.06574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853249.06999: variable 'ansible_facts' from source: unknown 11683 1726853249.07058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853249.07116: attempt loop complete, returning result 11683 1726853249.07120: _execute() done 11683 1726853249.07122: dumping result to json 11683 1726853249.07136: done dumping result, returning 11683 1726853249.07147: done running TaskExecutor() for 
managed_node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [02083763-bbaf-c5b2-e075-0000000000dd] 11683 1726853249.07150: sending task result for task 02083763-bbaf-c5b2-e075-0000000000dd 11683 1726853249.07533: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000dd 11683 1726853249.07536: WORKER PROCESS EXITING ok: [managed_node3] 11683 1726853249.07650: no more pending results, returning what we have 11683 1726853249.07653: results queue empty 11683 1726853249.07654: checking for any_errors_fatal 11683 1726853249.07655: done checking for any_errors_fatal 11683 1726853249.07656: checking for max_fail_percentage 11683 1726853249.07657: done checking for max_fail_percentage 11683 1726853249.07658: checking to see if all hosts have failed and the running result is not ok 11683 1726853249.07659: done checking to see if all hosts have failed 11683 1726853249.07660: getting the remaining hosts for this loop 11683 1726853249.07661: done getting the remaining hosts for this loop 11683 1726853249.07664: getting the next task for host managed_node3 11683 1726853249.07675: done getting next task for host managed_node3 11683 1726853249.07677: ^ task is: TASK: Check if system is ostree 11683 1726853249.07680: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853249.07684: getting variables 11683 1726853249.07685: in VariableManager get_vars() 11683 1726853249.07714: Calling all_inventory to load vars for managed_node3 11683 1726853249.07716: Calling groups_inventory to load vars for managed_node3 11683 1726853249.07720: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853249.07731: Calling all_plugins_play to load vars for managed_node3 11683 1726853249.07735: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853249.07738: Calling groups_plugins_play to load vars for managed_node3 11683 1726853249.08042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853249.08659: done with get_vars() 11683 1726853249.08669: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 13:27:29 -0400 (0:00:00.810) 0:00:02.160 ****** 11683 1726853249.08820: entering _queue_task() for managed_node3/stat 11683 1726853249.09414: worker is 1 (out of 1 available) 11683 1726853249.09426: exiting _queue_task() for managed_node3/stat 11683 1726853249.09440: done queuing things up, now waiting for results queue to drain 11683 1726853249.09441: waiting for pending results... 
11683 1726853249.10039: running TaskExecutor() for managed_node3/TASK: Check if system is ostree 11683 1726853249.10064: in run() - task 02083763-bbaf-c5b2-e075-0000000000df 11683 1726853249.10153: variable 'ansible_search_path' from source: unknown 11683 1726853249.10162: variable 'ansible_search_path' from source: unknown 11683 1726853249.10204: calling self._execute() 11683 1726853249.10463: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853249.10467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853249.10469: variable 'omit' from source: magic vars 11683 1726853249.11369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853249.11910: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853249.12021: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853249.12139: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853249.12213: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853249.12384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853249.12503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853249.12541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853249.12576: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853249.12876: Evaluated conditional (not __network_is_ostree is defined): True 11683 1726853249.12888: variable 'omit' from source: magic vars 11683 1726853249.13165: variable 'omit' from source: magic vars 11683 1726853249.13168: variable 'omit' from source: magic vars 11683 1726853249.13172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853249.13199: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853249.13218: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853249.13238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853249.13290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853249.13399: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853249.13420: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853249.13431: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853249.13716: Set connection var ansible_shell_executable to /bin/sh 11683 1726853249.13756: Set connection var ansible_timeout to 10 11683 1726853249.13826: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853249.14029: Set connection var ansible_pipelining to False 11683 1726853249.14032: Set connection var ansible_shell_type to sh 11683 1726853249.14037: Set connection var ansible_connection to ssh 11683 1726853249.14039: variable 'ansible_shell_executable' from source: unknown 11683 1726853249.14042: variable 'ansible_connection' from 
source: unknown 11683 1726853249.14045: variable 'ansible_module_compression' from source: unknown 11683 1726853249.14047: variable 'ansible_shell_type' from source: unknown 11683 1726853249.14051: variable 'ansible_shell_executable' from source: unknown 11683 1726853249.14054: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853249.14056: variable 'ansible_pipelining' from source: unknown 11683 1726853249.14058: variable 'ansible_timeout' from source: unknown 11683 1726853249.14060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853249.14321: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853249.14368: variable 'omit' from source: magic vars 11683 1726853249.14381: starting attempt loop 11683 1726853249.14397: running the handler 11683 1726853249.14504: _low_level_execute_command(): starting 11683 1726853249.14508: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853249.15823: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853249.15852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853249.15993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.16078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853249.16199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853249.16258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853249.18504: stdout chunk (state=3): >>>/root <<< 11683 1726853249.18638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853249.18868: stdout chunk (state=3): >>><<< 11683 1726853249.18874: stderr chunk (state=3): >>><<< 11683 1726853249.18878: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853249.18887: _low_level_execute_command(): starting 11683 1726853249.18890: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489 `" && echo ansible-tmp-1726853249.1879802-11778-24169317156489="` echo /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489 `" ) && sleep 0' 11683 1726853249.20112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853249.20281: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853249.20409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853249.20466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 11683 1726853249.22483: stdout chunk (state=3): >>>ansible-tmp-1726853249.1879802-11778-24169317156489=/root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489 <<< 11683 1726853249.22727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853249.22730: stdout chunk (state=3): >>><<< 11683 1726853249.22733: stderr chunk (state=3): >>><<< 11683 1726853249.22750: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853249.1879802-11778-24169317156489=/root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853249.22812: variable 'ansible_module_compression' from source: unknown 11683 1726853249.23180: ANSIBALLZ: Using lock for stat 11683 1726853249.23183: ANSIBALLZ: Acquiring lock 11683 1726853249.23186: ANSIBALLZ: Lock 
acquired: 139785061357072 11683 1726853249.23189: ANSIBALLZ: Creating module 11683 1726853249.49011: ANSIBALLZ: Writing module into payload 11683 1726853249.49128: ANSIBALLZ: Writing module 11683 1726853249.49146: ANSIBALLZ: Renaming module 11683 1726853249.49155: ANSIBALLZ: Done creating module 11683 1726853249.49181: variable 'ansible_facts' from source: unknown 11683 1726853249.49493: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py 11683 1726853249.50050: Sending initial data 11683 1726853249.50054: Sent initial data (152 bytes) 11683 1726853249.50896: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853249.50907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853249.50987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.51014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853249.51034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853249.51046: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11683 1726853249.51226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853249.53089: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py" <<< 11683 1726853249.53093: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpqglzh9kg /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py <<< 11683 1726853249.53123: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpqglzh9kg" to remote "/root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py" <<< 11683 1726853249.54657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853249.54892: stderr chunk (state=3): >>><<< 11683 1726853249.54895: stdout chunk (state=3): >>><<< 11683 1726853249.54905: done transferring module to remote 
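The transfer above is the AnsiballZ step: the `stat` module is packed into a single self-contained Python file, copied over SFTP into the per-task temp directory, and (as the next log lines show) marked executable and run with the remote interpreter. The sketch below imitates that pack/copy/chmod/run cycle locally using the stdlib `zipapp` module; the paths and the trivial module body are hypothetical, and ansible-core's real payload additionally embeds `module_utils` and the module arguments.

```python
# Minimal local imitation of the AnsiballZ cycle seen in the log:
# build a one-file zip payload, chmod u+x it, execute it with Python.
# Illustration only -- not ansible-core's actual packaging code.
import pathlib
import stat
import subprocess
import sys
import tempfile
import zipapp

workdir = pathlib.Path(tempfile.mkdtemp(prefix="ansible-tmp-"))
pkg = workdir / "mod"
pkg.mkdir()
# stand-in for the real module body
(pkg / "__main__.py").write_text('print("module ran")\n')

payload = workdir / "AnsiballZ_demo.pyz"
zipapp.create_archive(pkg, payload, interpreter="/usr/bin/env python3")

# equivalent of: chmod u+x .../AnsiballZ_stat.py && <python> .../AnsiballZ_stat.py
payload.chmod(payload.stat().st_mode | stat.S_IXUSR)
# the log uses /usr/bin/python3.12 on the remote; sys.executable here
out = subprocess.run([sys.executable, str(payload)],
                     capture_output=True, text=True)
print(out.stdout.strip())  # -> module ran
```

The `PYTHONVERBOSE=1` in the execution command that follows is why the stdout chunks later in the log are full of `import ...` lines: it makes the interpreter trace every module import while the payload runs.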
11683 1726853249.55169: _low_level_execute_command(): starting 11683 1726853249.55176: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/ /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py && sleep 0' 11683 1726853249.56216: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853249.56232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853249.56243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.56398: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853249.56410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853249.56421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853249.56601: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853249.59240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853249.59281: stderr chunk (state=3): >>><<< 11683 
1726853249.59354: stdout chunk (state=3): >>><<< 11683 1726853249.59375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853249.59379: _low_level_execute_command(): starting 11683 1726853249.59386: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/AnsiballZ_stat.py && sleep 0' 11683 1726853249.61277: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853249.61283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.61476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853249.61525: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853249.61665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853249.64847: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11683 1726853249.64904: stdout chunk (state=3): >>>import _imp # builtin <<< 11683 1726853249.64936: stdout chunk (state=3): >>>import '_thread' # <<< 11683 1726853249.64966: stdout chunk (state=3): >>> import '_warnings' # import '_weakref' # <<< 11683 1726853249.65003: stdout chunk (state=3): >>> <<< 11683 1726853249.65068: stdout chunk (state=3): >>>import '_io' # <<< 11683 1726853249.65108: stdout chunk (state=3): >>> import 'marshal' # <<< 11683 1726853249.65206: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 11683 1726853249.65252: stdout chunk (state=3): >>>import 'time' # <<< 11683 1726853249.65302: stdout chunk (state=3): >>> import 'zipimport' # <<< 11683 1726853249.65386: stdout chunk (state=3): >>># installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853249.65414: stdout chunk (state=3): >>>import '_codecs' # <<< 11683 1726853249.65627: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 11683 1726853249.65666: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc476184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc475e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4761aa50> <<< 11683 1726853249.65695: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 11683 1726853249.65698: stdout chunk (state=3): >>>import 'abc' # <<< 11683 1726853249.65733: stdout chunk (state=3): >>> import 'io' # <<< 11683 1726853249.65785: stdout chunk (state=3): >>>import '_stat' # <<< 11683 1726853249.66020: stdout chunk (state=3): >>>import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 11683 1726853249.66039: stdout chunk (state=3): >>>import 'os' # <<< 11683 1726853249.66075: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11683 1726853249.66349: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4742d130> <<< 11683 1726853249.66357: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11683 1726853249.66361: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 11683 1726853249.66363: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4742dfa0><<< 11683 1726853249.66458: stdout chunk (state=3): >>> import 'site' # <<< 11683 1726853249.66462: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux <<< 11683 1726853249.66606: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. 
<<< 11683 1726853249.66834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 11683 1726853249.66867: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11683 1726853249.66914: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11683 1726853249.66933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853249.66960: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 11683 1726853249.66979: stdout chunk (state=3): >>> <<< 11683 1726853249.67040: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11683 1726853249.67067: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11683 1726853249.67118: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 11683 1726853249.67146: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4746bec0> <<< 11683 1726853249.67174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11683 1726853249.67201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 11683 1726853249.67260: stdout chunk (state=3): >>> import '_operator' # <<< 11683 1726853249.67270: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4746bf80> <<< 11683 1726853249.67289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches 
/usr/lib64/python3.12/functools.py<<< 11683 1726853249.67346: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11683 1726853249.67389: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py<<< 11683 1726853249.67391: stdout chunk (state=3): >>> <<< 11683 1726853249.67463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc'<<< 11683 1726853249.67500: stdout chunk (state=3): >>> import 'itertools' # <<< 11683 1726853249.67535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 11683 1726853249.67595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11683 1726853249.67629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474a3ec0><<< 11683 1726853249.67642: stdout chunk (state=3): >>> import '_collections' # <<< 11683 1726853249.67731: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47483b60><<< 11683 1726853249.67743: stdout chunk (state=3): >>> import '_functools' # <<< 11683 1726853249.67932: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47469070> <<< 11683 1726853249.67976: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc 
matches /usr/lib64/python3.12/re/_compiler.py<<< 11683 1726853249.68028: stdout chunk (state=3): >>> <<< 11683 1726853249.68048: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11683 1726853249.68087: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11683 1726853249.68156: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11683 1726853249.68188: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11683 1726853249.68191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11683 1726853249.68234: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474c37d0> <<< 11683 1726853249.68282: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474c23f0> <<< 11683 1726853249.68340: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc'<<< 11683 1726853249.68343: stdout chunk (state=3): >>> import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474c0bc0><<< 11683 1726853249.68439: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11683 1726853249.68480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc474f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474682f0> <<< 11683 1726853249.68524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11683 1726853249.68576: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 11683 1726853249.68592: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc474f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f8bf0><<< 11683 1726853249.68670: stdout chunk (state=3): >>> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853249.68690: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc474f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47466e10><<< 11683 1726853249.68746: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 11683 1726853249.68781: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853249.68804: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11683 1726853249.68883: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f9670> <<< 11683 1726853249.68896: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f9370> import 'importlib.machinery' # <<< 11683 1726853249.68990: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 11683 1726853249.69025: stdout chunk (state=3): >>> import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474fa540> import 'importlib.util' # import 'runpy' # <<< 11683 1726853249.69222: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47510740> <<< 11683 1726853249.69279: stdout chunk (state=3): >>>import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc47511e20><<< 11683 1726853249.69388: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches 
/usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 11683 1726853249.69490: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47512cc0> <<< 11683 1726853249.69530: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc475132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47512210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11683 1726853249.69609: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 11683 1726853249.69633: stdout chunk (state=3): >>> # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc47513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc475134a0> <<< 11683 1726853249.69729: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11683 1726853249.69911: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' 
loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 11683 1726853249.70007: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11683 1726853249.70036: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11683 1726853249.70110: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853249.70355: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fd070> <<< 11683 1726853249.70602: stdout chunk (state=3): 
>>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fc920> <<< 11683 1726853249.70717: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472d1df0> <<< 11683 1726853249.70726: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11683 1726853249.70789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc'<<< 11683 1726853249.70897: stdout chunk (state=3): >>> import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fee10> <<< 11683 1726853249.70942: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 11683 1726853249.70965: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 11683 1726853249.71260: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11683 1726853249.71281: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 
'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473271a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11683 1726853249.71394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4734b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11683 1726853249.71462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 11683 1726853249.71496: stdout chunk (state=3): >>> <<< 11683 1726853249.71564: stdout chunk (state=3): >>>import 'ntpath' # <<< 11683 1726853249.71609: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473ac2c0><<< 11683 1726853249.71637: stdout chunk (state=3): >>> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py<<< 11683 1726853249.71658: stdout chunk (state=3): >>> <<< 11683 1726853249.71704: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 11683 1726853249.71891: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11683 1726853249.71912: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11683 1726853249.71938: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473aea20><<< 11683 1726853249.72066: stdout chunk (state=3): >>> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473ac3e0> <<< 11683 1726853249.72129: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4736d2b0> <<< 11683 1726853249.72182: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 11683 1726853249.72204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc471ad3d0> <<< 11683 1726853249.72249: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4734a360> <<< 11683 1726853249.72268: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472ffd70> <<< 11683 1726853249.72459: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11683 1726853249.72484: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7efc471ad670><<< 11683 1726853249.72561: stdout chunk (state=3): >>> <<< 11683 1726853249.72711: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_3cnp18eh/ansible_stat_payload.zip'<<< 11683 1726853249.72799: stdout chunk (state=3): >>> <<< 11683 1726853249.72889: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.72964: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.73009: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11683 1726853249.73042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11683 1726853249.73110: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11683 1726853249.73274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11683 1726853249.73341: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 11683 1726853249.73367: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47203170> import '_typing' # <<< 11683 1726853249.73813: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc471e2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc471e11f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11683 1726853249.76049: stdout chunk (state=3): >>># zipimport: zlib available<<< 11683 1726853249.76206: stdout chunk (state=3): >>> <<< 11683 1726853249.77994: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py<<< 11683 1726853249.78230: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47201040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc4722aae0> <<< 11683 1726853249.78280: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722a870> <<< 11683 1726853249.78495: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722a180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722abd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47203e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc4722b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc4722b9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11683 1726853249.78565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11683 1726853249.78588: stdout chunk (state=3): >>>import '_locale' # <<< 11683 1726853249.78655: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722bef0> <<< 11683 1726853249.78678: stdout chunk (state=3): >>>import 'pwd' # <<< 11683 1726853249.78700: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11683 1726853249.78754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11683 1726853249.78894: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b0dc70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b0f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11683 1726853249.78973: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b10290> <<< 11683 1726853249.79010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11683 1726853249.79098: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b11430> <<< 11683 1726853249.79114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11683 1726853249.79130: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11683 1726853249.79198: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11683 1726853249.79253: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b13f20> <<< 11683 1726853249.79348: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b185c0> <<< 11683 1726853249.79378: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b121e0> <<< 11683 1726853249.79412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py<<< 11683 1726853249.79550: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11683 1726853249.79574: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11683 1726853249.79614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py<<< 11683 1726853249.79668: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11683 1726853249.79674: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1bec0> import '_tokenize' # <<< 11683 1726853249.79778: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1a6f0><<< 11683 1726853249.79861: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11683 1726853249.79866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 11683 1726853249.79993: stdout chunk (state=3): >>> import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1ac60> <<< 11683 1726853249.80004: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b12660> <<< 11683 1726853249.80133: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b63ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b641a0> <<< 11683 1726853249.80163: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py<<< 11683 1726853249.80188: stdout chunk (state=3): >>> <<< 11683 1726853249.80340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 11683 1726853249.80343: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b65c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b65a00> <<< 11683 1726853249.80389: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11683 1726853249.80565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11683 1726853249.80700: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b681a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b66300> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11683 1726853249.80719: stdout chunk (state=3): >>>import '_string' # <<< 11683 1726853249.80754: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b6b950> <<< 11683 1726853249.80887: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b68350> <<< 11683 1726853249.80951: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6ca10> <<< 11683 1726853249.81034: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6cb60> <<< 11683 1726853249.81133: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7efc46b6cb30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b642c0> <<< 11683 1726853249.81136: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11683 1726853249.81180: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11683 1726853249.81183: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46bf43b0> <<< 11683 1726853249.81576: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46bf55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b6eb40> <<< 11683 1726853249.81617: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7efc46b6e7b0> <<< 11683 1726853249.81632: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853249.81665: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available <<< 11683 1726853249.81691: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11683 1726853249.81812: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.81951: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.82709: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.83514: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 11683 1726853249.83610: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11683 1726853249.83664: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46bfd7c0> <<< 11683 1726853249.83839: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc46bfe570> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bf5730> <<< 11683 1726853249.83882: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11683 1726853249.83938: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853249.83958: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11683 1726853249.84228: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.84462: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bfe210> <<< 11683 1726853249.84517: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.85350: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.85722: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.85932: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.85956: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 11683 1726853249.85967: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.86043: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.86134: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11683 1726853249.86164: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11683 1726853249.86261: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11683 1726853249.86491: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.86727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11683 1726853249.86792: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11683 1726853249.86810: stdout chunk (state=3): >>>import '_ast' # <<< 11683 1726853249.86899: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bff620> <<< 11683 1726853249.86911: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.86957: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87109: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11683 1726853249.87123: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11683 1726853249.87164: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11683 1726853249.87186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87208: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87312: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87424: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11683 1726853249.87465: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11683 
1726853249.87518: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46a0a090> <<< 11683 1726853249.87560: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46a07d10> <<< 11683 1726853249.87605: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 11683 1726853249.87616: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87687: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87929: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.87932: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11683 1726853249.87965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11683 1726853249.87982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 
11683 1726853249.88035: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472867b0> <<< 11683 1726853249.88077: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47272480> <<< 11683 1726853249.88168: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bfc770> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46a004d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11683 1726853249.88197: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.88214: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.88245: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11683 1726853249.88295: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11683 1726853249.88361: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11683 1726853249.88669: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.88697: stdout chunk (state=3): >>># zipimport: zlib available <<< 11683 1726853249.88812: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11683 1726853249.88827: stdout chunk (state=3): >>># destroy __main__ <<< 11683 1726853249.89204: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11683 1726853249.89276: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear 
sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc<<< 11683 1726853249.89279: stdout chunk (state=3): >>> # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix <<< 11683 1726853249.89282: stdout chunk (state=3): >>># cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] 
removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy <<< 11683 1726853249.89346: stdout chunk (state=3): >>># destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] 
removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] 
removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec <<< 11683 1726853249.89445: stdout chunk (state=3): >>># destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy 
ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11683 1726853249.89627: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11683 1726853249.89713: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 11683 1726853249.89835: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 11683 1726853249.89961: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 11683 1726853249.89965: stdout chunk (state=3): >>># destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 11683 1726853249.89975: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon 
# cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid<<< 11683 1726853249.90157: stdout chunk (state=3): >>> # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 11683 1726853249.90166: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # 
cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11683 1726853249.90359: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11683 1726853249.90362: stdout chunk (state=3): >>># destroy _collections <<< 11683 1726853249.90459: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 11683 1726853249.90484: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11683 1726853249.90562: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 11683 1726853249.90696: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib <<< 11683 1726853249.90699: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re # destroy itertools # 
destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11683 1726853249.91183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853249.91186: stdout chunk (state=3): >>><<< 11683 1726853249.91189: stderr chunk (state=3): >>><<< 11683 1726853249.91528: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc476184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc475e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4761aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: 
'/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4742d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4742dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4746bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4746bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474a3830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc474a3ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47483b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474812b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47469070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474c37d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474c23f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47482150> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f8890> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474682f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc474f8d40> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f8bf0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc474f8fe0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47466e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f9670> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474f9370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474fa540> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47510740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc47511e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47512cc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc475132f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47512210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7efc47513d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc475134a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474fa4b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472d3c50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fc710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fc470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fc740> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fd070> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc472fda60> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fc920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472d1df0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472fdb50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc474fac60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc473271a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4734b560> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473ac2c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473aea20> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc473ac3e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4736d2b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7efc471ad3d0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4734a360> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472ffd70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7efc471ad670> # zipimport: found 30 names in '/tmp/ansible_stat_payload_3cnp18eh/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47203170> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc471e2060> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc471e11f0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47201040> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc4722aae0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722a870> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722a180> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722abd0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47203e00> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc4722b860> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7efc4722b9e0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc4722bef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b0dc70> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b0f890> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b10290> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b11430> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b13f20> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b185c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b121e0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1bec0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1a990> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1a6f0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b1ac60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b12660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b63ec0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b641a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b65c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b65a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b681a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b66300> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b6b950> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b68350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6ca10> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6cb60> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6cb30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b642c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46bf43b0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46bf55e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b6eb40> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46b6fef0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46b6e7b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46bfd7c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bfe570> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bf5730> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bfe210> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bff620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7efc46a0a090> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46a07d10> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc472867b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc47272480> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46bfc770> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7efc46a004d0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: 
fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. [WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] 
removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] 
removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing 
ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # 
cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping 
sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11683 1726853249.92839: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], 
'_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853249.92842: _low_level_execute_command(): starting 11683 1726853249.92847: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853249.1879802-11778-24169317156489/ > /dev/null 2>&1 && sleep 0' 11683 1726853249.93307: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853249.93366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853249.93535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853249.93538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853249.93540: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853249.93925: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853249.95838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853249.95842: stdout chunk (state=3): >>><<< 11683 1726853249.95856: stderr chunk (state=3): >>><<< 11683 1726853249.95868: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853249.95992: handler run complete 11683 1726853249.96010: attempt loop complete, returning result 11683 1726853249.96014: _execute() done 11683 1726853249.96016: dumping result to json 11683 1726853249.96019: done dumping result, returning 11683 1726853249.96028: done running TaskExecutor() for managed_node3/TASK: Check if system is ostree [02083763-bbaf-c5b2-e075-0000000000df] 11683 1726853249.96031: sending task result for task 
02083763-bbaf-c5b2-e075-0000000000df ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11683 1726853249.96318: no more pending results, returning what we have 11683 1726853249.96321: results queue empty 11683 1726853249.96322: checking for any_errors_fatal 11683 1726853249.96328: done checking for any_errors_fatal 11683 1726853249.96329: checking for max_fail_percentage 11683 1726853249.96330: done checking for max_fail_percentage 11683 1726853249.96331: checking to see if all hosts have failed and the running result is not ok 11683 1726853249.96332: done checking to see if all hosts have failed 11683 1726853249.96333: getting the remaining hosts for this loop 11683 1726853249.96334: done getting the remaining hosts for this loop 11683 1726853249.96338: getting the next task for host managed_node3 11683 1726853249.96343: done getting next task for host managed_node3 11683 1726853249.96346: ^ task is: TASK: Set flag to indicate system is ostree 11683 1726853249.96348: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853249.96352: getting variables 11683 1726853249.96354: in VariableManager get_vars() 11683 1726853249.96384: Calling all_inventory to load vars for managed_node3 11683 1726853249.96387: Calling groups_inventory to load vars for managed_node3 11683 1726853249.96513: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853249.96529: Calling all_plugins_play to load vars for managed_node3 11683 1726853249.96532: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853249.96535: Calling groups_plugins_play to load vars for managed_node3 11683 1726853249.97015: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000df 11683 1726853249.97019: WORKER PROCESS EXITING 11683 1726853249.97191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853249.97539: done with get_vars() 11683 1726853249.97553: done getting variables 11683 1726853249.98083: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 13:27:29 -0400 (0:00:00.892) 0:00:03.053 ****** 11683 1726853249.98113: entering _queue_task() for managed_node3/set_fact 11683 1726853249.98115: Creating lock for set_fact 11683 1726853249.98880: worker is 1 (out of 1 available) 11683 1726853249.98892: exiting _queue_task() for managed_node3/set_fact 11683 1726853249.98903: done queuing things up, now waiting for results queue to drain 11683 1726853249.98905: waiting for pending results... 
11683 1726853249.99151: running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree 11683 1726853249.99422: in run() - task 02083763-bbaf-c5b2-e075-0000000000e0 11683 1726853249.99608: variable 'ansible_search_path' from source: unknown 11683 1726853249.99612: variable 'ansible_search_path' from source: unknown 11683 1726853249.99615: calling self._execute() 11683 1726853249.99648: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853249.99660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853249.99690: variable 'omit' from source: magic vars 11683 1726853250.00936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853250.01367: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853250.01424: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853250.01460: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853250.01498: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853250.01597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853250.01626: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853250.01663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853250.01749: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853250.01815: Evaluated conditional (not __network_is_ostree is defined): True 11683 1726853250.01829: variable 'omit' from source: magic vars 11683 1726853250.01881: variable 'omit' from source: magic vars 11683 1726853250.02003: variable '__ostree_booted_stat' from source: set_fact 11683 1726853250.02052: variable 'omit' from source: magic vars 11683 1726853250.02089: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853250.02121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853250.02181: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853250.02185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853250.02188: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853250.02222: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853250.02231: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853250.02239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853250.02336: Set connection var ansible_shell_executable to /bin/sh 11683 1726853250.02397: Set connection var ansible_timeout to 10 11683 1726853250.02400: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853250.02402: Set connection var ansible_pipelining to False 11683 1726853250.02404: Set connection var ansible_shell_type to sh 11683 1726853250.02406: Set connection var ansible_connection to ssh 11683 1726853250.02408: variable 'ansible_shell_executable' 
from source: unknown 11683 1726853250.02418: variable 'ansible_connection' from source: unknown 11683 1726853250.02426: variable 'ansible_module_compression' from source: unknown 11683 1726853250.02433: variable 'ansible_shell_type' from source: unknown 11683 1726853250.02440: variable 'ansible_shell_executable' from source: unknown 11683 1726853250.02447: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853250.02506: variable 'ansible_pipelining' from source: unknown 11683 1726853250.02509: variable 'ansible_timeout' from source: unknown 11683 1726853250.02511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853250.02573: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853250.02589: variable 'omit' from source: magic vars 11683 1726853250.02599: starting attempt loop 11683 1726853250.02605: running the handler 11683 1726853250.02627: handler run complete 11683 1726853250.02641: attempt loop complete, returning result 11683 1726853250.02648: _execute() done 11683 1726853250.02654: dumping result to json 11683 1726853250.02724: done dumping result, returning 11683 1726853250.02727: done running TaskExecutor() for managed_node3/TASK: Set flag to indicate system is ostree [02083763-bbaf-c5b2-e075-0000000000e0] 11683 1726853250.02728: sending task result for task 02083763-bbaf-c5b2-e075-0000000000e0 ok: [managed_node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11683 1726853250.02843: no more pending results, returning what we have 11683 1726853250.02849: results queue empty 11683 1726853250.02849: checking for any_errors_fatal 11683 1726853250.02855: done checking for any_errors_fatal 11683 
1726853250.02856: checking for max_fail_percentage 11683 1726853250.02857: done checking for max_fail_percentage 11683 1726853250.02858: checking to see if all hosts have failed and the running result is not ok 11683 1726853250.02863: done checking to see if all hosts have failed 11683 1726853250.02863: getting the remaining hosts for this loop 11683 1726853250.02865: done getting the remaining hosts for this loop 11683 1726853250.02869: getting the next task for host managed_node3 11683 1726853250.02879: done getting next task for host managed_node3 11683 1726853250.02882: ^ task is: TASK: Fix CentOS6 Base repo 11683 1726853250.02885: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853250.02888: getting variables 11683 1726853250.02890: in VariableManager get_vars() 11683 1726853250.02998: Calling all_inventory to load vars for managed_node3 11683 1726853250.03001: Calling groups_inventory to load vars for managed_node3 11683 1726853250.03174: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853250.03187: Calling all_plugins_play to load vars for managed_node3 11683 1726853250.03190: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853250.03192: Calling groups_plugins_play to load vars for managed_node3 11683 1726853250.03700: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000e0 11683 1726853250.03715: WORKER PROCESS EXITING 11683 1726853250.03907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853250.04275: done with get_vars() 11683 1726853250.04295: done getting variables 11683 1726853250.04416: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 13:27:30 -0400 (0:00:00.063) 0:00:03.116 ****** 11683 1726853250.04444: entering _queue_task() for managed_node3/copy 11683 1726853250.04904: worker is 1 (out of 1 available) 11683 1726853250.04913: exiting _queue_task() for managed_node3/copy 11683 1726853250.04922: done queuing things up, now waiting for results queue to drain 11683 1726853250.04924: waiting for pending results... 
11683 1726853250.05018: running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo
11683 1726853250.05123: in run() - task 02083763-bbaf-c5b2-e075-0000000000e2
11683 1726853250.05151: variable 'ansible_search_path' from source: unknown
11683 1726853250.05159: variable 'ansible_search_path' from source: unknown
11683 1726853250.05199: calling self._execute()
11683 1726853250.05282: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.05294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.05311: variable 'omit' from source: magic vars
11683 1726853250.05803: variable 'ansible_distribution' from source: facts
11683 1726853250.05832: Evaluated conditional (ansible_distribution == 'CentOS'): True
11683 1726853250.05959: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.06078: Evaluated conditional (ansible_distribution_major_version == '6'): False
11683 1726853250.06081: when evaluation is False, skipping this task
11683 1726853250.06083: _execute() done
11683 1726853250.06086: dumping result to json
11683 1726853250.06088: done dumping result, returning
11683 1726853250.06091: done running TaskExecutor() for managed_node3/TASK: Fix CentOS6 Base repo [02083763-bbaf-c5b2-e075-0000000000e2]
11683 1726853250.06093: sending task result for task 02083763-bbaf-c5b2-e075-0000000000e2
11683 1726853250.06169: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000e2
11683 1726853250.06174: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
11683 1726853250.06415: no more pending results, returning what we have
11683 1726853250.06419: results queue empty
11683 1726853250.06420: checking for any_errors_fatal
11683 1726853250.06424: done checking for any_errors_fatal
11683 1726853250.06425: checking for max_fail_percentage
11683 1726853250.06427: done checking for max_fail_percentage
11683 1726853250.06428: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.06429: done checking to see if all hosts have failed
11683 1726853250.06430: getting the remaining hosts for this loop
11683 1726853250.06432: done getting the remaining hosts for this loop
11683 1726853250.06435: getting the next task for host managed_node3
11683 1726853250.06441: done getting next task for host managed_node3
11683 1726853250.06444: ^ task is: TASK: Include the task 'enable_epel.yml'
11683 1726853250.06447: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.06452: getting variables
11683 1726853250.06453: in VariableManager get_vars()
11683 1726853250.06486: Calling all_inventory to load vars for managed_node3
11683 1726853250.06489: Calling groups_inventory to load vars for managed_node3
11683 1726853250.06493: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.06789: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.06794: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.06798: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.07159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.07716: done with get_vars()
11683 1726853250.07725: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Friday 20 September 2024 13:27:30 -0400 (0:00:00.034) 0:00:03.151 ******
11683 1726853250.07933: entering _queue_task() for managed_node3/include_tasks
11683 1726853250.08631: worker is 1 (out of 1 available)
11683 1726853250.08640: exiting _queue_task() for managed_node3/include_tasks
11683 1726853250.08651: done queuing things up, now waiting for results queue to drain
11683 1726853250.08652: waiting for pending results...
11683 1726853250.09152: running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml'
11683 1726853250.09157: in run() - task 02083763-bbaf-c5b2-e075-0000000000e3
11683 1726853250.09161: variable 'ansible_search_path' from source: unknown
11683 1726853250.09164: variable 'ansible_search_path' from source: unknown
11683 1726853250.09210: calling self._execute()
11683 1726853250.09468: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.09474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.09477: variable 'omit' from source: magic vars
11683 1726853250.10406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11683 1726853250.15065: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11683 1726853250.15193: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11683 1726853250.15386: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11683 1726853250.15426: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11683 1726853250.15499: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11683 1726853250.15704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11683 1726853250.15734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11683 1726853250.15764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11683 1726853250.15855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11683 1726853250.15940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11683 1726853250.16183: variable '__network_is_ostree' from source: set_fact
11683 1726853250.16232: Evaluated conditional (not __network_is_ostree | d(false)): True
11683 1726853250.16468: _execute() done
11683 1726853250.16473: dumping result to json
11683 1726853250.16476: done dumping result, returning
11683 1726853250.16478: done running TaskExecutor() for managed_node3/TASK: Include the task 'enable_epel.yml' [02083763-bbaf-c5b2-e075-0000000000e3]
11683 1726853250.16481: sending task result for task 02083763-bbaf-c5b2-e075-0000000000e3
11683 1726853250.16549: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000e3
11683 1726853250.16552: WORKER PROCESS EXITING
11683 1726853250.16603: no more pending results, returning what we have
11683 1726853250.16608: in VariableManager get_vars()
11683 1726853250.16645: Calling all_inventory to load vars for managed_node3
11683 1726853250.16648: Calling groups_inventory to load vars for managed_node3
11683 1726853250.16652: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.16665: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.16668: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.16775: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.17266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.17699: done with get_vars()
11683 1726853250.17708: variable 'ansible_search_path' from source: unknown
11683 1726853250.17709: variable 'ansible_search_path' from source: unknown
11683 1726853250.17861: we have included files to process
11683 1726853250.17862: generating all_blocks data
11683 1726853250.17864: done generating all_blocks data
11683 1726853250.17869: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
11683 1726853250.17872: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
11683 1726853250.17875: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
11683 1726853250.19341: done processing included file
11683 1726853250.19343: iterating over new_blocks loaded from include file
11683 1726853250.19344: in VariableManager get_vars()
11683 1726853250.19358: done with get_vars()
11683 1726853250.19359: filtering new block on tags
11683 1726853250.19500: done filtering new block on tags
11683 1726853250.19503: in VariableManager get_vars()
11683 1726853250.19515: done with get_vars()
11683 1726853250.19517: filtering new block on tags
11683 1726853250.19528: done filtering new block on tags
11683 1726853250.19531: done iterating over new_blocks loaded from include file
included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node3
11683 1726853250.19537: extending task lists for all hosts with included blocks
11683 1726853250.19763: done extending task lists
11683 1726853250.19764: done processing included files
11683 1726853250.19765: results queue empty
11683 1726853250.19766: checking for any_errors_fatal
11683 1726853250.19769: done checking for any_errors_fatal
11683 1726853250.19770: checking for max_fail_percentage
11683 1726853250.19875: done checking for max_fail_percentage
11683 1726853250.19877: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.19877: done checking to see if all hosts have failed
11683 1726853250.19878: getting the remaining hosts for this loop
11683 1726853250.19879: done getting the remaining hosts for this loop
11683 1726853250.19882: getting the next task for host managed_node3
11683 1726853250.19886: done getting next task for host managed_node3
11683 1726853250.19888: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
11683 1726853250.19891: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.19893: getting variables
11683 1726853250.19894: in VariableManager get_vars()
11683 1726853250.19903: Calling all_inventory to load vars for managed_node3
11683 1726853250.19905: Calling groups_inventory to load vars for managed_node3
11683 1726853250.19917: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.19923: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.19934: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.19938: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.20296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.20731: done with get_vars()
11683 1726853250.20740: done getting variables
11683 1726853250.20923: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
11683 1726853250.21380: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024 13:27:30 -0400 (0:00:00.134) 0:00:03.287 ******
11683 1726853250.21549: entering _queue_task() for managed_node3/command
11683 1726853250.21551: Creating lock for command
11683 1726853250.22245: worker is 1 (out of 1 available)
11683 1726853250.22256: exiting _queue_task() for managed_node3/command
11683 1726853250.22266: done queuing things up, now waiting for results queue to drain
11683 1726853250.22268: waiting for pending results...
11683 1726853250.22549: running TaskExecutor() for managed_node3/TASK: Create EPEL 10
11683 1726853250.22640: in run() - task 02083763-bbaf-c5b2-e075-0000000000fd
11683 1726853250.22651: variable 'ansible_search_path' from source: unknown
11683 1726853250.22654: variable 'ansible_search_path' from source: unknown
11683 1726853250.23477: calling self._execute()
11683 1726853250.23505: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.23509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.23512: variable 'omit' from source: magic vars
11683 1726853250.24335: variable 'ansible_distribution' from source: facts
11683 1726853250.24340: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11683 1726853250.24415: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.24776: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11683 1726853250.24781: when evaluation is False, skipping this task
11683 1726853250.24783: _execute() done
11683 1726853250.24787: dumping result to json
11683 1726853250.24790: done dumping result, returning
11683 1726853250.24796: done running TaskExecutor() for managed_node3/TASK: Create EPEL 10 [02083763-bbaf-c5b2-e075-0000000000fd]
11683 1726853250.24799: sending task result for task 02083763-bbaf-c5b2-e075-0000000000fd
11683 1726853250.24879: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000fd
11683 1726853250.24882: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11683 1726853250.24979: no more pending results, returning what we have
11683 1726853250.24983: results queue empty
11683 1726853250.24984: checking for any_errors_fatal
11683 1726853250.24985: done checking for any_errors_fatal
11683 1726853250.24986: checking for max_fail_percentage
11683 1726853250.24987: done checking for max_fail_percentage
11683 1726853250.24988: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.24989: done checking to see if all hosts have failed
11683 1726853250.24989: getting the remaining hosts for this loop
11683 1726853250.24997: done getting the remaining hosts for this loop
11683 1726853250.25001: getting the next task for host managed_node3
11683 1726853250.25007: done getting next task for host managed_node3
11683 1726853250.25009: ^ task is: TASK: Install yum-utils package
11683 1726853250.25012: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.25016: getting variables
11683 1726853250.25017: in VariableManager get_vars()
11683 1726853250.25131: Calling all_inventory to load vars for managed_node3
11683 1726853250.25138: Calling groups_inventory to load vars for managed_node3
11683 1726853250.25142: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.25155: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.25157: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.25160: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.25562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.25999: done with get_vars()
11683 1726853250.26014: done getting variables
11683 1726853250.26211: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024 13:27:30 -0400 (0:00:00.047) 0:00:03.335 ******
11683 1726853250.26322: entering _queue_task() for managed_node3/package
11683 1726853250.26324: Creating lock for package
11683 1726853250.26863: worker is 1 (out of 1 available)
11683 1726853250.26991: exiting _queue_task() for managed_node3/package
11683 1726853250.27003: done queuing things up, now waiting for results queue to drain
11683 1726853250.27004: waiting for pending results...
11683 1726853250.27589: running TaskExecutor() for managed_node3/TASK: Install yum-utils package
11683 1726853250.27594: in run() - task 02083763-bbaf-c5b2-e075-0000000000fe
11683 1726853250.27597: variable 'ansible_search_path' from source: unknown
11683 1726853250.27599: variable 'ansible_search_path' from source: unknown
11683 1726853250.27679: calling self._execute()
11683 1726853250.27822: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.27985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.28001: variable 'omit' from source: magic vars
11683 1726853250.28661: variable 'ansible_distribution' from source: facts
11683 1726853250.28976: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11683 1726853250.28979: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.28982: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11683 1726853250.28985: when evaluation is False, skipping this task
11683 1726853250.28987: _execute() done
11683 1726853250.28989: dumping result to json
11683 1726853250.28991: done dumping result, returning
11683 1726853250.29677: done running TaskExecutor() for managed_node3/TASK: Install yum-utils package [02083763-bbaf-c5b2-e075-0000000000fe]
11683 1726853250.29681: sending task result for task 02083763-bbaf-c5b2-e075-0000000000fe
11683 1726853250.29760: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000fe
11683 1726853250.29764: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11683 1726853250.29978: no more pending results, returning what we have
11683 1726853250.29982: results queue empty
11683 1726853250.29983: checking for any_errors_fatal
11683 1726853250.29987: done checking for any_errors_fatal
11683 1726853250.29987: checking for max_fail_percentage
11683 1726853250.29989: done checking for max_fail_percentage
11683 1726853250.29990: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.29991: done checking to see if all hosts have failed
11683 1726853250.29992: getting the remaining hosts for this loop
11683 1726853250.29993: done getting the remaining hosts for this loop
11683 1726853250.29998: getting the next task for host managed_node3
11683 1726853250.30004: done getting next task for host managed_node3
11683 1726853250.30007: ^ task is: TASK: Enable EPEL 7
11683 1726853250.30011: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.30015: getting variables
11683 1726853250.30017: in VariableManager get_vars()
11683 1726853250.30043: Calling all_inventory to load vars for managed_node3
11683 1726853250.30046: Calling groups_inventory to load vars for managed_node3
11683 1726853250.30050: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.30183: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.30187: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.30190: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.30335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.30859: done with get_vars()
11683 1726853250.30993: done getting variables
11683 1726853250.31060: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024 13:27:30 -0400 (0:00:00.048) 0:00:03.383 ******
11683 1726853250.31152: entering _queue_task() for managed_node3/command
11683 1726853250.31841: worker is 1 (out of 1 available)
11683 1726853250.31854: exiting _queue_task() for managed_node3/command
11683 1726853250.32107: done queuing things up, now waiting for results queue to drain
11683 1726853250.32109: waiting for pending results...
11683 1726853250.32417: running TaskExecutor() for managed_node3/TASK: Enable EPEL 7
11683 1726853250.32575: in run() - task 02083763-bbaf-c5b2-e075-0000000000ff
11683 1726853250.32600: variable 'ansible_search_path' from source: unknown
11683 1726853250.32609: variable 'ansible_search_path' from source: unknown
11683 1726853250.32660: calling self._execute()
11683 1726853250.32755: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.32760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.32779: variable 'omit' from source: magic vars
11683 1726853250.33211: variable 'ansible_distribution' from source: facts
11683 1726853250.33247: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11683 1726853250.33377: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.33401: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11683 1726853250.33404: when evaluation is False, skipping this task
11683 1726853250.33406: _execute() done
11683 1726853250.33494: dumping result to json
11683 1726853250.33497: done dumping result, returning
11683 1726853250.33500: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 7 [02083763-bbaf-c5b2-e075-0000000000ff]
11683 1726853250.33502: sending task result for task 02083763-bbaf-c5b2-e075-0000000000ff
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11683 1726853250.33656: no more pending results, returning what we have
11683 1726853250.33660: results queue empty
11683 1726853250.33660: checking for any_errors_fatal
11683 1726853250.33668: done checking for any_errors_fatal
11683 1726853250.33669: checking for max_fail_percentage
11683 1726853250.33672: done checking for max_fail_percentage
11683 1726853250.33673: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.33675: done checking to see if all hosts have failed
11683 1726853250.33675: getting the remaining hosts for this loop
11683 1726853250.33677: done getting the remaining hosts for this loop
11683 1726853250.33680: getting the next task for host managed_node3
11683 1726853250.33687: done getting next task for host managed_node3
11683 1726853250.33689: ^ task is: TASK: Enable EPEL 8
11683 1726853250.33693: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.33698: getting variables
11683 1726853250.33700: in VariableManager get_vars()
11683 1726853250.33731: Calling all_inventory to load vars for managed_node3
11683 1726853250.33734: Calling groups_inventory to load vars for managed_node3
11683 1726853250.33737: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.33750: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.33754: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.33757: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.34439: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000ff
11683 1726853250.34443: WORKER PROCESS EXITING
11683 1726853250.34497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.35400: done with get_vars()
11683 1726853250.35410: done getting variables
11683 1726853250.35525: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Friday 20 September 2024 13:27:30 -0400 (0:00:00.045) 0:00:03.428 ******
11683 1726853250.35683: entering _queue_task() for managed_node3/command
11683 1726853250.36794: worker is 1 (out of 1 available)
11683 1726853250.36808: exiting _queue_task() for managed_node3/command
11683 1726853250.36819: done queuing things up, now waiting for results queue to drain
11683 1726853250.36820: waiting for pending results...
11683 1726853250.37087: running TaskExecutor() for managed_node3/TASK: Enable EPEL 8
11683 1726853250.37378: in run() - task 02083763-bbaf-c5b2-e075-000000000100
11683 1726853250.37382: variable 'ansible_search_path' from source: unknown
11683 1726853250.37384: variable 'ansible_search_path' from source: unknown
11683 1726853250.37386: calling self._execute()
11683 1726853250.37612: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.37628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.37644: variable 'omit' from source: magic vars
11683 1726853250.38482: variable 'ansible_distribution' from source: facts
11683 1726853250.38777: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11683 1726853250.38839: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.38850: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11683 1726853250.38858: when evaluation is False, skipping this task
11683 1726853250.38864: _execute() done
11683 1726853250.38870: dumping result to json
11683 1726853250.38880: done dumping result, returning
11683 1726853250.38890: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 8 [02083763-bbaf-c5b2-e075-000000000100]
11683 1726853250.38901: sending task result for task 02083763-bbaf-c5b2-e075-000000000100
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11683 1726853250.39052: no more pending results, returning what we have
11683 1726853250.39055: results queue empty
11683 1726853250.39056: checking for any_errors_fatal
11683 1726853250.39061: done checking for any_errors_fatal
11683 1726853250.39062: checking for max_fail_percentage
11683 1726853250.39064: done checking for max_fail_percentage
11683 1726853250.39065: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.39066: done checking to see if all hosts have failed
11683 1726853250.39067: getting the remaining hosts for this loop
11683 1726853250.39068: done getting the remaining hosts for this loop
11683 1726853250.39079: getting the next task for host managed_node3
11683 1726853250.39091: done getting next task for host managed_node3
11683 1726853250.39093: ^ task is: TASK: Enable EPEL 6
11683 1726853250.39097: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.39102: getting variables
11683 1726853250.39104: in VariableManager get_vars()
11683 1726853250.39137: Calling all_inventory to load vars for managed_node3
11683 1726853250.39140: Calling groups_inventory to load vars for managed_node3
11683 1726853250.39144: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.39158: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.39161: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.39164: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.39900: done sending task result for task 02083763-bbaf-c5b2-e075-000000000100
11683 1726853250.39903: WORKER PROCESS EXITING
11683 1726853250.39926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.40280: done with get_vars()
11683 1726853250.40290: done getting variables
11683 1726853250.40347: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Friday 20 September 2024 13:27:30 -0400 (0:00:00.048) 0:00:03.476 ******
11683 1726853250.40494: entering _queue_task() for managed_node3/copy
11683 1726853250.41052: worker is 1 (out of 1 available)
11683 1726853250.41066: exiting _queue_task() for managed_node3/copy
11683 1726853250.41083: done queuing things up, now waiting for results queue to drain
11683 1726853250.41085: waiting for pending results...
11683 1726853250.41457: running TaskExecutor() for managed_node3/TASK: Enable EPEL 6
11683 1726853250.41606: in run() - task 02083763-bbaf-c5b2-e075-000000000102
11683 1726853250.41625: variable 'ansible_search_path' from source: unknown
11683 1726853250.41635: variable 'ansible_search_path' from source: unknown
11683 1726853250.41673: calling self._execute()
11683 1726853250.41758: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.41769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.41785: variable 'omit' from source: magic vars
11683 1726853250.42144: variable 'ansible_distribution' from source: facts
11683 1726853250.42161: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11683 1726853250.42284: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.42295: Evaluated conditional (ansible_distribution_major_version == '6'): False
11683 1726853250.42306: when evaluation is False, skipping this task
11683 1726853250.42313: _execute() done
11683 1726853250.42320: dumping result to json
11683 1726853250.42326: done dumping result, returning
11683 1726853250.42337: done running TaskExecutor() for managed_node3/TASK: Enable EPEL 6 [02083763-bbaf-c5b2-e075-000000000102]
11683 1726853250.42348: sending task result for task 02083763-bbaf-c5b2-e075-000000000102
11683 1726853250.42679: done sending task result for task 02083763-bbaf-c5b2-e075-000000000102
11683 1726853250.42682: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
11683 1726853250.42865: no more pending results, returning what we have
11683 1726853250.42869: results queue empty
11683 1726853250.42870: checking for any_errors_fatal
11683 1726853250.42879: done checking for any_errors_fatal
11683 1726853250.42880: checking for max_fail_percentage
11683 1726853250.42881: done checking for max_fail_percentage
11683 1726853250.42882: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.42883: done checking to see if all hosts have failed
11683 1726853250.42884: getting the remaining hosts for this loop
11683 1726853250.42885: done getting the remaining hosts for this loop
11683 1726853250.42888: getting the next task for host managed_node3
11683 1726853250.42895: done getting next task for host managed_node3
11683 1726853250.42898: ^ task is: TASK: Set network provider to 'nm'
11683 1726853250.42900: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.42904: getting variables
11683 1726853250.42906: in VariableManager get_vars()
11683 1726853250.42935: Calling all_inventory to load vars for managed_node3
11683 1726853250.42937: Calling groups_inventory to load vars for managed_node3
11683 1726853250.42940: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.42951: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.42954: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.42957: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.43404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.43582: done with get_vars()
11683 1726853250.43590: done getting variables
11683 1726853250.43645: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13
Friday 20 September 2024 13:27:30 -0400 (0:00:00.031) 0:00:03.508 ******
11683 1726853250.43669: entering _queue_task() for managed_node3/set_fact
11683 1726853250.43924: worker is 1 (out of 1 available)
11683 1726853250.43936: exiting _queue_task() for managed_node3/set_fact
11683 1726853250.43949: done queuing things up, now waiting for results queue to drain
11683 1726853250.43950: waiting for pending results...
11683 1726853250.44209: running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm'
11683 1726853250.44311: in run() - task 02083763-bbaf-c5b2-e075-000000000007
11683 1726853250.44333: variable 'ansible_search_path' from source: unknown
11683 1726853250.44380: calling self._execute()
11683 1726853250.44502: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.44506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.44509: variable 'omit' from source: magic vars
11683 1726853250.44598: variable 'omit' from source: magic vars
11683 1726853250.44635: variable 'omit' from source: magic vars
11683 1726853250.44675: variable 'omit' from source: magic vars
11683 1726853250.44727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11683 1726853250.44976: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11683 1726853250.44979: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11683 1726853250.44981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11683 1726853250.44984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11683 1726853250.44986: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11683 1726853250.44988: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.44989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.44991: Set connection var ansible_shell_executable to /bin/sh
11683 1726853250.44993: Set connection var ansible_timeout to 10
11683 1726853250.44995: Set connection var ansible_module_compression to ZIP_DEFLATED
11683 1726853250.44997: Set connection var ansible_pipelining to False
11683 1726853250.44999: Set connection var ansible_shell_type to sh
11683 1726853250.45001: Set connection var ansible_connection to ssh
11683 1726853250.45018: variable 'ansible_shell_executable' from source: unknown
11683 1726853250.45029: variable 'ansible_connection' from source: unknown
11683 1726853250.45036: variable 'ansible_module_compression' from source: unknown
11683 1726853250.45041: variable 'ansible_shell_type' from source: unknown
11683 1726853250.45047: variable 'ansible_shell_executable' from source: unknown
11683 1726853250.45054: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.45061: variable 'ansible_pipelining' from source: unknown
11683 1726853250.45066: variable 'ansible_timeout' from source: unknown
11683 1726853250.45076: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.45220: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11683 1726853250.45241: variable 'omit' from source: magic vars
11683 1726853250.45251: starting attempt loop
11683 1726853250.45258: running the handler
11683 1726853250.45274: handler run complete
11683 1726853250.45289: attempt loop complete, returning result
11683 1726853250.45295: _execute() done
11683 1726853250.45302: dumping result to json
11683 1726853250.45308: done dumping result, returning
11683 1726853250.45318: done running TaskExecutor() for managed_node3/TASK: Set network provider to 'nm' [02083763-bbaf-c5b2-e075-000000000007]
11683 1726853250.45327: sending task result for task 02083763-bbaf-c5b2-e075-000000000007
ok: [managed_node3] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
11683 1726853250.45495: no more pending results, returning what we have
11683 1726853250.45498: results queue empty
11683 1726853250.45499: checking for any_errors_fatal
11683 1726853250.45504: done checking for any_errors_fatal
11683 1726853250.45505: checking for max_fail_percentage
11683 1726853250.45507: done checking for max_fail_percentage
11683 1726853250.45508: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.45509: done checking to see if all hosts have failed
11683 1726853250.45510: getting the remaining hosts for this loop
11683 1726853250.45512: done getting the remaining hosts for this loop
11683 1726853250.45516: getting the next task for host managed_node3
11683 1726853250.45523: done getting next task for host managed_node3
11683 1726853250.45525: ^ task is: TASK: meta (flush_handlers)
11683 1726853250.45527: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.45531: getting variables
11683 1726853250.45533: in VariableManager get_vars()
11683 1726853250.45564: Calling all_inventory to load vars for managed_node3
11683 1726853250.45567: Calling groups_inventory to load vars for managed_node3
11683 1726853250.45572: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.45584: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.45588: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.45591: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.46002: done sending task result for task 02083763-bbaf-c5b2-e075-000000000007
11683 1726853250.46005: WORKER PROCESS EXITING
11683 1726853250.46027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.46222: done with get_vars()
11683 1726853250.46232: done getting variables
11683 1726853250.46300: in VariableManager get_vars()
11683 1726853250.46308: Calling all_inventory to load vars for managed_node3
11683 1726853250.46311: Calling groups_inventory to load vars for managed_node3
11683 1726853250.46313: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.46317: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.46319: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.46322: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.46454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.46662: done with get_vars()
11683 1726853250.46678: done queuing things up, now waiting for results queue to drain
11683 1726853250.46680: results queue empty
11683 1726853250.46681: checking for any_errors_fatal
11683 1726853250.46683: done checking for any_errors_fatal
11683 1726853250.46684: checking for max_fail_percentage
11683 1726853250.46685: done checking for max_fail_percentage
11683 1726853250.46685: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.46686: done checking to see if all hosts have failed
11683 1726853250.46687: getting the remaining hosts for this loop
11683 1726853250.46688: done getting the remaining hosts for this loop
11683 1726853250.46690: getting the next task for host managed_node3
11683 1726853250.46694: done getting next task for host managed_node3
11683 1726853250.46695: ^ task is: TASK: meta (flush_handlers)
11683 1726853250.46697: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.46705: getting variables
11683 1726853250.46706: in VariableManager get_vars()
11683 1726853250.46714: Calling all_inventory to load vars for managed_node3
11683 1726853250.46716: Calling groups_inventory to load vars for managed_node3
11683 1726853250.46718: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.46722: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.46725: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.46728: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.46853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.47034: done with get_vars()
11683 1726853250.47042: done getting variables
11683 1726853250.47089: in VariableManager get_vars()
11683 1726853250.47096: Calling all_inventory to load vars for managed_node3
11683 1726853250.47098: Calling groups_inventory to load vars for managed_node3
11683 1726853250.47101: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.47106: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.47109: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.47114: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.47250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.47456: done with get_vars()
11683 1726853250.47466: done queuing things up, now waiting for results queue to drain
11683 1726853250.47468: results queue empty
11683 1726853250.47469: checking for any_errors_fatal
11683 1726853250.47473: done checking for any_errors_fatal
11683 1726853250.47473: checking for max_fail_percentage
11683 1726853250.47474: done checking for max_fail_percentage
11683 1726853250.47475: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.47476: done checking to see if all hosts have failed
11683 1726853250.47476: getting the remaining hosts for this loop
11683 1726853250.47477: done getting the remaining hosts for this loop
11683 1726853250.47481: getting the next task for host managed_node3
11683 1726853250.47483: done getting next task for host managed_node3
11683 1726853250.47485: ^ task is: None
11683 1726853250.47486: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.47487: done queuing things up, now waiting for results queue to drain
11683 1726853250.47489: results queue empty
11683 1726853250.47490: checking for any_errors_fatal
11683 1726853250.47491: done checking for any_errors_fatal
11683 1726853250.47491: checking for max_fail_percentage
11683 1726853250.47492: done checking for max_fail_percentage
11683 1726853250.47493: checking to see if all hosts have failed and the running result is not ok
11683 1726853250.47494: done checking to see if all hosts have failed
11683 1726853250.47497: getting the next task for host managed_node3
11683 1726853250.47500: done getting next task for host managed_node3
11683 1726853250.47500: ^ task is: None
11683 1726853250.47502: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.47552: in VariableManager get_vars()
11683 1726853250.47579: done with get_vars()
11683 1726853250.47585: in VariableManager get_vars()
11683 1726853250.47602: done with get_vars()
11683 1726853250.47607: variable 'omit' from source: magic vars
11683 1726853250.47641: in VariableManager get_vars()
11683 1726853250.47658: done with get_vars()
11683 1726853250.47685: variable 'omit' from source: magic vars

PLAY [Play for testing bond connection] ****************************************
11683 1726853250.48367: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False)
11683 1726853250.48394: getting the remaining hosts for this loop
11683 1726853250.48397: done getting the remaining hosts for this loop
11683 1726853250.48399: getting the next task for host managed_node3
11683 1726853250.48401: done getting next task for host managed_node3
11683 1726853250.48403: ^ task is: TASK: Gathering Facts
11683 1726853250.48405: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11683 1726853250.48406: getting variables
11683 1726853250.48407: in VariableManager get_vars()
11683 1726853250.48418: Calling all_inventory to load vars for managed_node3
11683 1726853250.48421: Calling groups_inventory to load vars for managed_node3
11683 1726853250.48423: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853250.48429: Calling all_plugins_play to load vars for managed_node3
11683 1726853250.48444: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853250.48448: Calling groups_plugins_play to load vars for managed_node3
11683 1726853250.48633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853250.49223: done with get_vars()
11683 1726853250.49232: done getting variables
11683 1726853250.49277: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
Friday 20 September 2024 13:27:30 -0400 (0:00:00.056) 0:00:03.565 ******
11683 1726853250.49304: entering _queue_task() for managed_node3/gather_facts
11683 1726853250.49804: worker is 1 (out of 1 available)
11683 1726853250.49817: exiting _queue_task() for managed_node3/gather_facts
11683 1726853250.49829: done queuing things up, now waiting for results queue to drain
11683 1726853250.49832: waiting for pending results...
11683 1726853250.50689: running TaskExecutor() for managed_node3/TASK: Gathering Facts
11683 1726853250.50695: in run() - task 02083763-bbaf-c5b2-e075-000000000128
11683 1726853250.50698: variable 'ansible_search_path' from source: unknown
11683 1726853250.50701: calling self._execute()
11683 1726853250.50836: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.50889: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.50906: variable 'omit' from source: magic vars
11683 1726853250.51767: variable 'ansible_distribution_major_version' from source: facts
11683 1726853250.51790: Evaluated conditional (ansible_distribution_major_version != '6'): True
11683 1726853250.51878: variable 'omit' from source: magic vars
11683 1726853250.51889: variable 'omit' from source: magic vars
11683 1726853250.52124: variable 'omit' from source: magic vars
11683 1726853250.52128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11683 1726853250.52136: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11683 1726853250.52161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11683 1726853250.52189: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11683 1726853250.52206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11683 1726853250.52248: variable 'inventory_hostname' from source: host vars for 'managed_node3'
11683 1726853250.52261: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.52269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.52381: Set connection var ansible_shell_executable to /bin/sh
11683 1726853250.52397: Set connection var ansible_timeout to 10
11683 1726853250.52408: Set connection var ansible_module_compression to ZIP_DEFLATED
11683 1726853250.52416: Set connection var ansible_pipelining to False
11683 1726853250.52425: Set connection var ansible_shell_type to sh
11683 1726853250.52437: Set connection var ansible_connection to ssh
11683 1726853250.52557: variable 'ansible_shell_executable' from source: unknown
11683 1726853250.52560: variable 'ansible_connection' from source: unknown
11683 1726853250.52562: variable 'ansible_module_compression' from source: unknown
11683 1726853250.52564: variable 'ansible_shell_type' from source: unknown
11683 1726853250.52566: variable 'ansible_shell_executable' from source: unknown
11683 1726853250.52567: variable 'ansible_host' from source: host vars for 'managed_node3'
11683 1726853250.52569: variable 'ansible_pipelining' from source: unknown
11683 1726853250.52574: variable 'ansible_timeout' from source: unknown
11683 1726853250.52576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3'
11683 1726853250.52719: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11683 1726853250.52735: variable 'omit' from source: magic vars
11683 1726853250.52746: starting attempt loop
11683 1726853250.52753: running the handler
11683 1726853250.52780: variable 'ansible_facts' from source: unknown
11683 1726853250.52808: _low_level_execute_command(): starting
11683 1726853250.52822: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11683 1726853250.53682: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
11683 1726853250.53701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11683 1726853250.53878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
11683 1726853250.56194: stdout chunk (state=3): >>>/root <<<
11683 1726853250.56455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11683 1726853250.56461: stdout chunk (state=3): >>><<<
11683 1726853250.56464: stderr chunk (state=3): >>><<<
11683 1726853250.56518: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
11683 1726853250.56630: _low_level_execute_command(): starting
11683 1726853250.56677: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993 `" && echo ansible-tmp-1726853250.5652475-11831-187516698060993="` echo /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993 `" ) && sleep 0'
11683 1726853250.58053: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
11683 1726853250.58168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<<
11683 1726853250.58173: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11683 1726853250.58206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<<
11683 1726853250.58234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11683 1726853250.58263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11683 1726853250.58376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
11683 1726853250.61241: stdout chunk (state=3): >>>ansible-tmp-1726853250.5652475-11831-187516698060993=/root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993 <<<
11683 1726853250.61451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11683 1726853250.61486: stderr chunk (state=3): >>><<<
11683 1726853250.61510: stdout chunk (state=3): >>><<<
11683 1726853250.61527: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853250.5652475-11831-187516698060993=/root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
11683 1726853250.61677: variable 'ansible_module_compression' from source: unknown
11683 1726853250.61680: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
11683 1726853250.61723: variable 'ansible_facts' from source: unknown
11683 1726853250.62094: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py
11683 1726853250.62413: Sending initial data
11683 1726853250.62441: Sent initial data (154 bytes)
11683 1726853250.63108: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
11683 1726853250.63222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<<
11683 1726853250.63284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11683 1726853250.63370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
11683 1726853250.65715: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
11683 1726853250.65817: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
11683 1726853250.65995: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpm6mqbujy /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py <<<
11683 1726853250.66000: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py" <<<
11683 1726853250.66129: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpm6mqbujy" to remote "/root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py" <<<
11683 1726853250.67736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11683 1726853250.67817: stderr chunk (state=3): >>><<<
11683 1726853250.67820: stdout chunk (state=3): >>><<<
11683 1726853250.67823: done transferring module to remote
11683 1726853250.67829: _low_level_execute_command(): starting
11683 1726853250.67834: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/ /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py && sleep 0'
11683 1726853250.68411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
11683 1726853250.68414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<<
11683 1726853250.68417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11683 
1726853250.68419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853250.68422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853250.68486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853250.68579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853250.71186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853250.71212: stderr chunk (state=3): >>><<< 11683 1726853250.71215: stdout chunk (state=3): >>><<< 11683 1726853250.71234: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853250.71237: _low_level_execute_command(): starting 11683 1726853250.71249: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/AnsiballZ_setup.py && sleep 0' 11683 1726853250.71801: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853250.71804: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853250.71836: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853250.71850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853250.71853: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 
4 <<< 11683 1726853250.71918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853251.56019: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", 
"ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "31", "epoch": "1726853251", "epoch_int": "1726853251", "date": "2024-09-20", "time": "13:27:31", "iso8601_micro": "2024-09-20T17:27:31.163925Z", "iso8601": "2024-09-20T17:27:31Z", "iso8601_basic": "20240920T132731163925", "iso8601_basic_short": "20240920T132731", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, 
"ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2981, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 550, "free": 2981}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0,<<< 11683 1726853251.56025: stdout chunk (state=3): >>> "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 395, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805608960, "block_size": 4096, "block_total": 65519099, "block_available": 63917385, "block_used": 1601714, "inode_total": 131070960, "inode_available": 131029150, "inode_used": 41810, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.65380859375, "5m": 0.33642578125, "15m": 0.1552734375}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "<<< 11683 1726853251.56063: stdout chunk (state=3): >>>off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": 
"on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off 
[fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansibl<<< 11683 1726853251.56073: stdout chunk (state=3): >>>e_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/1"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11683 1726853251.59183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853251.59188: stdout chunk (state=3): >>><<< 11683 1726853251.59191: stderr chunk (state=3): >>><<< 11683 1726853251.59210: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCoYDjPBT+oZMoH+Iz89VMad3pzgkIKqOuUO8QZPyEpVfgGNjVOaglgkXLQyOXulclB6EA4nlBVmVP6IyWL+N4gskjf5Qmm5n+WHu3amXXm9l66v+yKWruV18sgIn8o6iAdCgZrJFzdFVNeKRLYV6P67syyegqRkOJe7U2m/rxA967Vm6wcrwJN8eiZc8tbx1lHOuJNCcP20ThNxMHIPfvT2im8rlt/ojf6e+Z1axRnvFBubBg1qDfxHEX6AlxMHl+CIOXxGHxsvSxfLq7lhrXrComBSxTDv+fmHeJkck3VGck2rn8Hz3eBTty453RU3pq6KxdqU1GB+Z+VYHInXEtXb2i38QpLqRfiLqwUxdjkFKqz0j5z/+NskQF3w8j6uz77Revs9G8eMs14sICJtnD9qBLhFuGxlR/ovZCjynVjBTKBfDaVASpjV0iCIZKSgLn6zSaM/6FBhnBoeb4ch2iiw2S8Q0aKJNKIp0AfY21IVplyS3VCu+Br3fIXzoINo0s=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIX3PaAHBZzPq23oQJkywX/2bB39yz9ZrSLgsNfhL04NHwnY0Up/oiN+aiUte1DWFqV5wiDLJpl9a1tDLARWXNA=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIBb2CS4KVp6KVVqnGA45j7tkSkijXfGxbd3neKpDsjh9", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-11-217", "ansible_nodename": "ip-10-31-11-217.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2a85cf21f17783c9da20681cb8e352", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "13", "minute": "27", "second": "31", "epoch": "1726853251", "epoch_int": "1726853251", "date": "2024-09-20", "time": "13:27:31", "iso8601_micro": "2024-09-20T17:27:31.163925Z", "iso8601": "2024-09-20T17:27:31Z", "iso8601_basic": "20240920T132731163925", "iso8601_basic_short": "20240920T132731", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2981, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 550, "free": 2981}, "nocache": {"free": 3295, "used": 236}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_uuid": "ec2a85cf-21f1-7783-c9da-20681cb8e352", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": 
["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 395, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261805608960, "block_size": 4096, "block_total": 65519099, "block_available": 63917385, "block_used": 1601714, "inode_total": 131070960, "inode_available": 131029150, "inode_used": 41810, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_apparmor": {"status": "disabled"}, "ansible_loadavg": {"1m": 0.65380859375, "5m": 0.33642578125, "15m": 0.1552734375}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::102a:53ff:fe36:f0e9", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.217", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:2a:53:36:f0:e9", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.217"], "ansible_all_ipv6_addresses": ["fe80::102a:53ff:fe36:f0e9"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.217", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::102a:53ff:fe36:f0e9"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": 
"en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.46.199 55604 10.31.11.217 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.46.199 55604 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853251.59678: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853251.59682: _low_level_execute_command(): starting 11683 1726853251.59684: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853250.5652475-11831-187516698060993/ > /dev/null 2>&1 && sleep 0' 11683 1726853251.60320: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853251.60339: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853251.60356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853251.60449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853251.60483: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853251.60506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853251.60528: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853251.60682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853251.63328: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853251.63357: stderr chunk (state=3): >>><<< 11683 1726853251.63361: stdout chunk (state=3): >>><<< 11683 1726853251.63378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853251.63385: handler run complete 11683 1726853251.63462: variable 'ansible_facts' from source: unknown 11683 1726853251.63526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853251.63705: variable 'ansible_facts' from source: unknown 11683 1726853251.63818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853251.63893: attempt loop complete, returning result 11683 1726853251.63897: _execute() done 11683 1726853251.63899: dumping result to json 11683 1726853251.63918: done dumping result, returning 11683 1726853251.63989: done running TaskExecutor() for managed_node3/TASK: Gathering Facts [02083763-bbaf-c5b2-e075-000000000128] 11683 1726853251.63992: sending task result for task 02083763-bbaf-c5b2-e075-000000000128 ok: [managed_node3] 11683 1726853251.64525: no more pending results, returning what we have 11683 1726853251.64527: results queue empty 11683 1726853251.64528: checking for any_errors_fatal 11683 1726853251.64529: done checking for any_errors_fatal 11683 1726853251.64529: checking for max_fail_percentage 11683 1726853251.64530: done checking for max_fail_percentage 11683 1726853251.64531: checking to see if all hosts have failed and the running result is not ok 11683 1726853251.64531: done checking to see if all hosts have failed 11683 1726853251.64532: getting the remaining hosts for this loop 11683 1726853251.64533: done getting the remaining hosts for this loop 11683 1726853251.64535: getting the next task for 
host managed_node3 11683 1726853251.64538: done getting next task for host managed_node3 11683 1726853251.64539: ^ task is: TASK: meta (flush_handlers) 11683 1726853251.64541: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853251.64543: getting variables 11683 1726853251.64544: in VariableManager get_vars() 11683 1726853251.64567: Calling all_inventory to load vars for managed_node3 11683 1726853251.64568: Calling groups_inventory to load vars for managed_node3 11683 1726853251.64570: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853251.64580: Calling all_plugins_play to load vars for managed_node3 11683 1726853251.64582: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853251.64585: Calling groups_plugins_play to load vars for managed_node3 11683 1726853251.64687: done sending task result for task 02083763-bbaf-c5b2-e075-000000000128 11683 1726853251.64690: WORKER PROCESS EXITING 11683 1726853251.64700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853251.64814: done with get_vars() 11683 1726853251.64822: done getting variables 11683 1726853251.64888: in VariableManager get_vars() 11683 1726853251.64898: Calling all_inventory to load vars for managed_node3 11683 1726853251.64900: Calling groups_inventory to load vars for managed_node3 11683 1726853251.64901: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853251.64904: Calling all_plugins_play to load vars for managed_node3 11683 1726853251.64906: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853251.64907: Calling groups_plugins_play to load vars for managed_node3 11683 1726853251.64990: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853251.65101: done with get_vars() 11683 1726853251.65110: done queuing things up, now waiting for results queue to drain 11683 1726853251.65111: results queue empty 11683 1726853251.65112: checking for any_errors_fatal 11683 1726853251.65113: done checking for any_errors_fatal 11683 1726853251.65114: checking for max_fail_percentage 11683 1726853251.65118: done checking for max_fail_percentage 11683 1726853251.65119: checking to see if all hosts have failed and the running result is not ok 11683 1726853251.65119: done checking to see if all hosts have failed 11683 1726853251.65119: getting the remaining hosts for this loop 11683 1726853251.65120: done getting the remaining hosts for this loop 11683 1726853251.65122: getting the next task for host managed_node3 11683 1726853251.65124: done getting next task for host managed_node3 11683 1726853251.65126: ^ task is: TASK: INIT Prepare setup 11683 1726853251.65127: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853251.65128: getting variables 11683 1726853251.65129: in VariableManager get_vars() 11683 1726853251.65136: Calling all_inventory to load vars for managed_node3 11683 1726853251.65137: Calling groups_inventory to load vars for managed_node3 11683 1726853251.65139: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853251.65142: Calling all_plugins_play to load vars for managed_node3 11683 1726853251.65143: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853251.65145: Calling groups_plugins_play to load vars for managed_node3 11683 1726853251.65224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853251.65358: done with get_vars() 11683 1726853251.65363: done getting variables 11683 1726853251.65430: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Friday 20 September 2024 13:27:31 -0400 (0:00:01.161) 0:00:04.726 ****** 11683 1726853251.65458: entering _queue_task() for managed_node3/debug 11683 1726853251.65460: Creating lock for debug 11683 1726853251.65712: worker is 1 (out of 1 available) 11683 1726853251.65728: exiting _queue_task() for managed_node3/debug 11683 1726853251.65738: done queuing things up, now waiting for results queue to drain 11683 1726853251.65740: waiting for pending results... 
11683 1726853251.65940: running TaskExecutor() for managed_node3/TASK: INIT Prepare setup 11683 1726853251.66002: in run() - task 02083763-bbaf-c5b2-e075-00000000000b 11683 1726853251.66014: variable 'ansible_search_path' from source: unknown 11683 1726853251.66042: calling self._execute() 11683 1726853251.66119: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853251.66123: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853251.66132: variable 'omit' from source: magic vars 11683 1726853251.66449: variable 'ansible_distribution_major_version' from source: facts 11683 1726853251.66467: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853251.66473: variable 'omit' from source: magic vars 11683 1726853251.66488: variable 'omit' from source: magic vars 11683 1726853251.66546: variable 'omit' from source: magic vars 11683 1726853251.66560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853251.66658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853251.66661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853251.66664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853251.66690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853251.66758: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853251.66763: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853251.66772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853251.66835: Set connection var ansible_shell_executable to /bin/sh 11683 1726853251.66844: 
Set connection var ansible_timeout to 10 11683 1726853251.66853: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853251.66867: Set connection var ansible_pipelining to False 11683 1726853251.66872: Set connection var ansible_shell_type to sh 11683 1726853251.66875: Set connection var ansible_connection to ssh 11683 1726853251.66898: variable 'ansible_shell_executable' from source: unknown 11683 1726853251.66901: variable 'ansible_connection' from source: unknown 11683 1726853251.66903: variable 'ansible_module_compression' from source: unknown 11683 1726853251.66906: variable 'ansible_shell_type' from source: unknown 11683 1726853251.66908: variable 'ansible_shell_executable' from source: unknown 11683 1726853251.66910: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853251.66912: variable 'ansible_pipelining' from source: unknown 11683 1726853251.66914: variable 'ansible_timeout' from source: unknown 11683 1726853251.66916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853251.67033: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853251.67041: variable 'omit' from source: magic vars 11683 1726853251.67049: starting attempt loop 11683 1726853251.67051: running the handler 11683 1726853251.67168: handler run complete 11683 1726853251.67173: attempt loop complete, returning result 11683 1726853251.67175: _execute() done 11683 1726853251.67177: dumping result to json 11683 1726853251.67181: done dumping result, returning 11683 1726853251.67183: done running TaskExecutor() for managed_node3/TASK: INIT Prepare setup [02083763-bbaf-c5b2-e075-00000000000b] 11683 1726853251.67185: sending task result for task 
02083763-bbaf-c5b2-e075-00000000000b 11683 1726853251.67266: done sending task result for task 02083763-bbaf-c5b2-e075-00000000000b 11683 1726853251.67268: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: ################################################## 11683 1726853251.67324: no more pending results, returning what we have 11683 1726853251.67327: results queue empty 11683 1726853251.67328: checking for any_errors_fatal 11683 1726853251.67330: done checking for any_errors_fatal 11683 1726853251.67335: checking for max_fail_percentage 11683 1726853251.67338: done checking for max_fail_percentage 11683 1726853251.67340: checking to see if all hosts have failed and the running result is not ok 11683 1726853251.67341: done checking to see if all hosts have failed 11683 1726853251.67342: getting the remaining hosts for this loop 11683 1726853251.67343: done getting the remaining hosts for this loop 11683 1726853251.67348: getting the next task for host managed_node3 11683 1726853251.67353: done getting next task for host managed_node3 11683 1726853251.67359: ^ task is: TASK: Install dnsmasq 11683 1726853251.67361: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853251.67367: getting variables 11683 1726853251.67369: in VariableManager get_vars() 11683 1726853251.67411: Calling all_inventory to load vars for managed_node3 11683 1726853251.67413: Calling groups_inventory to load vars for managed_node3 11683 1726853251.67415: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853251.67425: Calling all_plugins_play to load vars for managed_node3 11683 1726853251.67427: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853251.67430: Calling groups_plugins_play to load vars for managed_node3 11683 1726853251.67592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853251.67753: done with get_vars() 11683 1726853251.67766: done getting variables 11683 1726853251.67814: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:27:31 -0400 (0:00:00.023) 0:00:04.750 ****** 11683 1726853251.67850: entering _queue_task() for managed_node3/package 11683 1726853251.68090: worker is 1 (out of 1 available) 11683 1726853251.68102: exiting _queue_task() for managed_node3/package 11683 1726853251.68114: done queuing things up, now waiting for results queue to drain 11683 1726853251.68115: waiting for pending results... 
11683 1726853251.68382: running TaskExecutor() for managed_node3/TASK: Install dnsmasq 11683 1726853251.68476: in run() - task 02083763-bbaf-c5b2-e075-00000000000f 11683 1726853251.68488: variable 'ansible_search_path' from source: unknown 11683 1726853251.68491: variable 'ansible_search_path' from source: unknown 11683 1726853251.68520: calling self._execute() 11683 1726853251.68586: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853251.68615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853251.68618: variable 'omit' from source: magic vars 11683 1726853251.68968: variable 'ansible_distribution_major_version' from source: facts 11683 1726853251.68993: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853251.69029: variable 'omit' from source: magic vars 11683 1726853251.69075: variable 'omit' from source: magic vars 11683 1726853251.69467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853251.71135: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853251.71198: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853251.71225: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853251.71257: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853251.71274: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853251.71346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853251.71368: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853251.71386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853251.71411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853251.71423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853251.71500: variable '__network_is_ostree' from source: set_fact 11683 1726853251.71504: variable 'omit' from source: magic vars 11683 1726853251.71526: variable 'omit' from source: magic vars 11683 1726853251.71550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853251.71588: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853251.71591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853251.71615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853251.71618: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853251.71647: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853251.71667: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853251.71670: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 11683 1726853251.71781: Set connection var ansible_shell_executable to /bin/sh 11683 1726853251.71785: Set connection var ansible_timeout to 10 11683 1726853251.71787: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853251.71790: Set connection var ansible_pipelining to False 11683 1726853251.71792: Set connection var ansible_shell_type to sh 11683 1726853251.71794: Set connection var ansible_connection to ssh 11683 1726853251.71885: variable 'ansible_shell_executable' from source: unknown 11683 1726853251.71888: variable 'ansible_connection' from source: unknown 11683 1726853251.71890: variable 'ansible_module_compression' from source: unknown 11683 1726853251.71892: variable 'ansible_shell_type' from source: unknown 11683 1726853251.71895: variable 'ansible_shell_executable' from source: unknown 11683 1726853251.71896: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853251.71898: variable 'ansible_pipelining' from source: unknown 11683 1726853251.71901: variable 'ansible_timeout' from source: unknown 11683 1726853251.71903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853251.71932: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853251.71950: variable 'omit' from source: magic vars 11683 1726853251.71953: starting attempt loop 11683 1726853251.71968: running the handler 11683 1726853251.71973: variable 'ansible_facts' from source: unknown 11683 1726853251.71981: variable 'ansible_facts' from source: unknown 11683 1726853251.72079: _low_level_execute_command(): starting 11683 1726853251.72082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 
1726853251.72620: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853251.72657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853251.72700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853251.72720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853251.72797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853251.75222: stdout chunk (state=3): >>>/root <<< 11683 1726853251.75389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853251.75416: stderr chunk (state=3): >>><<< 11683 1726853251.75419: stdout chunk (state=3): >>><<< 11683 1726853251.75443: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853251.75455: _low_level_execute_command(): starting 11683 1726853251.75459: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006 `" && echo ansible-tmp-1726853251.7544188-11876-24369948533006="` echo /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006 `" ) && sleep 0' 11683 1726853251.75975: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853251.75979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853251.75981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853251.75983: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853251.75986: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853251.76000: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853251.76026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853251.76042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853251.76110: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853251.78856: stdout chunk (state=3): >>>ansible-tmp-1726853251.7544188-11876-24369948533006=/root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006 <<< 11683 1726853251.79026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853251.79058: stderr chunk (state=3): >>><<< 11683 1726853251.79061: stdout chunk (state=3): >>><<< 11683 1726853251.79079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853251.7544188-11876-24369948533006=/root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853251.79107: variable 'ansible_module_compression' from source: unknown 11683 1726853251.79172: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11683 1726853251.79176: ANSIBALLZ: Acquiring lock 11683 1726853251.79180: ANSIBALLZ: Lock acquired: 139785061355968 11683 1726853251.79182: ANSIBALLZ: Creating module 11683 1726853251.91377: ANSIBALLZ: Writing module into payload 11683 1726853251.91381: ANSIBALLZ: Writing module 11683 1726853251.91384: ANSIBALLZ: Renaming module 11683 1726853251.91387: ANSIBALLZ: Done creating module 11683 1726853251.91391: variable 'ansible_facts' from source: unknown 11683 1726853251.91486: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py 11683 1726853251.91800: Sending initial data 11683 1726853251.91804: Sent initial data (151 bytes) 11683 1726853251.92393: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853251.92398: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853251.92478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853251.94755: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11683 1726853251.94765: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853251.94819: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853251.94887: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmph1w1m004 /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py <<< 11683 1726853251.94891: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py" <<< 11683 1726853251.94946: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmph1w1m004" to remote "/root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py" <<< 11683 1726853251.94953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py" <<< 11683 1726853251.95714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853251.95754: stderr chunk (state=3): >>><<< 11683 1726853251.95757: stdout chunk (state=3): >>><<< 11683 1726853251.95795: done transferring module to remote 11683 1726853251.95804: _low_level_execute_command(): starting 11683 1726853251.95811: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/ /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py && sleep 0' 11683 1726853251.96236: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853251.96240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853251.96250: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11683 1726853251.96265: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853251.96360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853251.96425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853251.98989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853251.99016: stderr chunk (state=3): >>><<< 11683 1726853251.99019: stdout chunk (state=3): >>><<< 11683 1726853251.99033: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853251.99037: _low_level_execute_command(): starting 11683 1726853251.99039: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/AnsiballZ_dnf.py && sleep 0' 11683 1726853251.99669: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853251.99674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853251.99683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853251.99686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853251.99689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853251.99743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853251.99751: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853251.99835: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853252.60465: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11683 1726853252.67379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853252.67384: stdout chunk (state=3): >>><<< 11683 1726853252.67386: stderr chunk (state=3): >>><<< 11683 1726853252.67389: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853252.67395: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853252.67397: _low_level_execute_command(): starting 11683 1726853252.67399: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853251.7544188-11876-24369948533006/ > /dev/null 2>&1 && sleep 0' 11683 1726853252.68002: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853252.68018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853252.68031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853252.68077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 
11683 1726853252.68091: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853252.68186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853252.68210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853252.68314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853252.70949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853252.70969: stderr chunk (state=3): >>><<< 11683 1726853252.70974: stdout chunk (state=3): >>><<< 11683 1726853252.71010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853252.71014: handler run complete 11683 1726853252.71207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853252.71406: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853252.71451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853252.71508: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853252.71530: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853252.71617: variable '__install_status' from source: unknown 11683 1726853252.71725: Evaluated conditional (__install_status is success): True 11683 1726853252.71728: attempt loop complete, returning result 11683 1726853252.71731: _execute() done 11683 1726853252.71733: dumping result to json 11683 1726853252.71735: done dumping result, returning 11683 1726853252.71737: done running TaskExecutor() for managed_node3/TASK: Install dnsmasq [02083763-bbaf-c5b2-e075-00000000000f] 11683 1726853252.71739: sending task result for task 02083763-bbaf-c5b2-e075-00000000000f ok: [managed_node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11683 1726853252.71943: no more pending results, returning what we have 11683 1726853252.71949: results queue empty 11683 1726853252.71950: checking for any_errors_fatal 11683 1726853252.71977: done checking for any_errors_fatal 11683 1726853252.71978: checking for max_fail_percentage 11683 1726853252.71980: done checking for max_fail_percentage 11683 1726853252.71981: checking 
to see if all hosts have failed and the running result is not ok 11683 1726853252.71982: done checking to see if all hosts have failed 11683 1726853252.71983: getting the remaining hosts for this loop 11683 1726853252.71985: done getting the remaining hosts for this loop 11683 1726853252.71988: getting the next task for host managed_node3 11683 1726853252.71993: done getting next task for host managed_node3 11683 1726853252.71995: ^ task is: TASK: Install pgrep, sysctl 11683 1726853252.71997: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853252.72001: getting variables 11683 1726853252.72003: in VariableManager get_vars() 11683 1726853252.72041: Calling all_inventory to load vars for managed_node3 11683 1726853252.72047: Calling groups_inventory to load vars for managed_node3 11683 1726853252.72050: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853252.72061: Calling all_plugins_play to load vars for managed_node3 11683 1726853252.72070: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853252.72075: Calling groups_plugins_play to load vars for managed_node3 11683 1726853252.72276: done sending task result for task 02083763-bbaf-c5b2-e075-00000000000f 11683 1726853252.72285: WORKER PROCESS EXITING 11683 1726853252.72305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853252.72477: done with get_vars() 11683 1726853252.72485: done getting variables 11683 1726853252.72554: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 13:27:32 -0400 (0:00:01.047) 0:00:05.797 ****** 11683 1726853252.72589: entering _queue_task() for managed_node3/package 11683 1726853252.72875: worker is 1 (out of 1 available) 11683 1726853252.72888: exiting _queue_task() for managed_node3/package 11683 1726853252.72901: done queuing things up, now waiting for results queue to drain 11683 1726853252.72903: waiting for pending results... 
11683 1726853252.73150: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11683 1726853252.73264: in run() - task 02083763-bbaf-c5b2-e075-000000000010 11683 1726853252.73288: variable 'ansible_search_path' from source: unknown 11683 1726853252.73474: variable 'ansible_search_path' from source: unknown 11683 1726853252.73478: calling self._execute() 11683 1726853252.73481: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853252.73491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853252.73495: variable 'omit' from source: magic vars 11683 1726853252.73793: variable 'ansible_distribution_major_version' from source: facts 11683 1726853252.73807: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853252.74060: variable 'ansible_os_family' from source: facts 11683 1726853252.74064: Evaluated conditional (ansible_os_family == 'RedHat'): True 11683 1726853252.74175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853252.74463: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853252.74520: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853252.74595: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853252.74640: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853252.74722: variable 'ansible_distribution_major_version' from source: facts 11683 1726853252.74748: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11683 1726853252.74756: when evaluation is False, skipping this task 11683 1726853252.74763: _execute() done 11683 1726853252.74769: dumping result to json 11683 1726853252.74777: done dumping result, 
returning 11683 1726853252.74785: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [02083763-bbaf-c5b2-e075-000000000010] 11683 1726853252.74792: sending task result for task 02083763-bbaf-c5b2-e075-000000000010 skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11683 1726853252.74938: no more pending results, returning what we have 11683 1726853252.74942: results queue empty 11683 1726853252.74943: checking for any_errors_fatal 11683 1726853252.74948: done checking for any_errors_fatal 11683 1726853252.74949: checking for max_fail_percentage 11683 1726853252.74951: done checking for max_fail_percentage 11683 1726853252.74952: checking to see if all hosts have failed and the running result is not ok 11683 1726853252.74953: done checking to see if all hosts have failed 11683 1726853252.74954: getting the remaining hosts for this loop 11683 1726853252.74956: done getting the remaining hosts for this loop 11683 1726853252.74960: getting the next task for host managed_node3 11683 1726853252.74966: done getting next task for host managed_node3 11683 1726853252.74968: ^ task is: TASK: Install pgrep, sysctl 11683 1726853252.74972: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853252.74975: getting variables 11683 1726853252.74977: in VariableManager get_vars() 11683 1726853252.75014: Calling all_inventory to load vars for managed_node3 11683 1726853252.75017: Calling groups_inventory to load vars for managed_node3 11683 1726853252.75019: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853252.75031: Calling all_plugins_play to load vars for managed_node3 11683 1726853252.75033: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853252.75036: Calling groups_plugins_play to load vars for managed_node3 11683 1726853252.75226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853252.75440: done with get_vars() 11683 1726853252.75454: done getting variables 11683 1726853252.75526: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 13:27:32 -0400 (0:00:00.029) 0:00:05.827 ****** 11683 1726853252.75562: entering _queue_task() for managed_node3/package 11683 1726853252.75966: done sending task result for task 02083763-bbaf-c5b2-e075-000000000010 11683 1726853252.76004: WORKER PROCESS EXITING 11683 1726853252.75994: worker is 1 (out of 1 available) 11683 1726853252.76054: exiting _queue_task() for managed_node3/package 11683 1726853252.76063: done queuing things up, now waiting for results queue to drain 11683 1726853252.76064: waiting for pending results... 
11683 1726853252.76232: running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl 11683 1726853252.76362: in run() - task 02083763-bbaf-c5b2-e075-000000000011 11683 1726853252.76392: variable 'ansible_search_path' from source: unknown 11683 1726853252.76402: variable 'ansible_search_path' from source: unknown 11683 1726853252.76439: calling self._execute() 11683 1726853252.76525: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853252.76534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853252.76548: variable 'omit' from source: magic vars 11683 1726853252.77062: variable 'ansible_distribution_major_version' from source: facts 11683 1726853252.77080: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853252.77209: variable 'ansible_os_family' from source: facts 11683 1726853252.77468: Evaluated conditional (ansible_os_family == 'RedHat'): True 11683 1726853252.77782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853252.78404: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853252.78460: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853252.78501: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853252.78581: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853252.78816: variable 'ansible_distribution_major_version' from source: facts 11683 1726853252.78838: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11683 1726853252.78851: variable 'omit' from source: magic vars 11683 1726853252.78906: variable 'omit' from source: magic vars 11683 1726853252.79078: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853252.81092: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853252.81194: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853252.81202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853252.81873: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853252.81877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853252.82032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853252.82118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853252.82276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853252.82279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853252.82305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853252.82579: variable '__network_is_ostree' from source: set_fact 11683 1726853252.82590: 
variable 'omit' from source: magic vars 11683 1726853252.82626: variable 'omit' from source: magic vars 11683 1726853252.82656: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853252.82714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853252.82775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853252.82910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853252.82926: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853252.82959: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853252.82968: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853252.82980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853252.83089: Set connection var ansible_shell_executable to /bin/sh 11683 1726853252.83249: Set connection var ansible_timeout to 10 11683 1726853252.83262: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853252.83273: Set connection var ansible_pipelining to False 11683 1726853252.83281: Set connection var ansible_shell_type to sh 11683 1726853252.83304: Set connection var ansible_connection to ssh 11683 1726853252.83333: variable 'ansible_shell_executable' from source: unknown 11683 1726853252.83350: variable 'ansible_connection' from source: unknown 11683 1726853252.83560: variable 'ansible_module_compression' from source: unknown 11683 1726853252.83564: variable 'ansible_shell_type' from source: unknown 11683 1726853252.83566: variable 'ansible_shell_executable' from source: unknown 11683 1726853252.83568: variable 'ansible_host' from source: host vars for 'managed_node3' 
11683 1726853252.83570: variable 'ansible_pipelining' from source: unknown 11683 1726853252.83573: variable 'ansible_timeout' from source: unknown 11683 1726853252.83576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853252.83630: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853252.83668: variable 'omit' from source: magic vars 11683 1726853252.83673: starting attempt loop 11683 1726853252.83676: running the handler 11683 1726853252.83678: variable 'ansible_facts' from source: unknown 11683 1726853252.83680: variable 'ansible_facts' from source: unknown 11683 1726853252.83718: _low_level_execute_command(): starting 11683 1726853252.83778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853252.85292: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11683 1726853252.85344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853252.85454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853252.85468: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853252.85564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11683 1726853252.87679: stdout chunk (state=3): >>>/root <<< 11683 1726853252.88068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853252.88073: stdout chunk (state=3): >>><<< 11683 1726853252.88076: stderr chunk (state=3): >>><<< 11683 1726853252.88079: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853252.88082: 
_low_level_execute_command(): starting 11683 1726853252.88084: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241 `" && echo ansible-tmp-1726853252.879805-11935-95179589475241="` echo /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241 `" ) && sleep 0' 11683 1726853252.89103: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853252.89215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853252.89235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853252.89256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853252.89455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853252.89492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853252.89512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853252.89581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853252.89615: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 4 <<< 11683 1726853252.92534: stdout chunk (state=3): >>>ansible-tmp-1726853252.879805-11935-95179589475241=/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241 <<< 11683 1726853252.92538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853252.92543: stdout chunk (state=3): >>><<< 11683 1726853252.92550: stderr chunk (state=3): >>><<< 11683 1726853252.92568: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853252.879805-11935-95179589475241=/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11683 1726853252.92806: variable 'ansible_module_compression' from source: unknown 11683 1726853252.92864: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11683 1726853252.93176: variable 'ansible_facts' from source: unknown 11683 1726853252.93306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py 11683 1726853252.93703: Sending initial data 11683 1726853252.93706: Sent initial data (150 bytes) 11683 1726853252.94697: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853252.94712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853252.94802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853252.96493: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853252.96545: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853252.96603: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmposk6sugv /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py <<< 11683 1726853252.96606: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py" <<< 11683 1726853252.96675: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11683 1726853252.96690: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmposk6sugv" to remote "/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py" <<< 11683 1726853252.98056: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853252.98087: stderr chunk (state=3): >>><<< 11683 1726853252.98090: stdout chunk (state=3): >>><<< 11683 1726853252.98113: done transferring module to remote 11683 1726853252.98124: _low_level_execute_command(): starting 11683 1726853252.98129: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/ 
/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py && sleep 0' 11683 1726853252.98794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853252.98846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853252.98868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853252.98880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853252.99000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.00929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853253.00961: stdout chunk (state=3): >>><<< 11683 1726853253.00964: stderr chunk (state=3): >>><<< 11683 1726853253.00983: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853253.01068: _low_level_execute_command(): starting 11683 1726853253.01074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/AnsiballZ_dnf.py && sleep 0' 11683 1726853253.01619: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853253.01696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853253.01747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853253.01764: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.01790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853253.01898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.44909: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11683 1726853253.49391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853253.49395: stdout chunk (state=3): >>><<< 11683 1726853253.49398: stderr chunk (state=3): >>><<< 11683 1726853253.49478: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853253.49491: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853253.49495: _low_level_execute_command(): starting 11683 1726853253.49510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853252.879805-11935-95179589475241/ > /dev/null 2>&1 && sleep 0' 11683 1726853253.50167: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853253.50266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.50294: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853253.50396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.52308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853253.52335: stdout chunk (state=3): >>><<< 11683 1726853253.52338: stderr chunk (state=3): >>><<< 11683 1726853253.52441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 11683 1726853253.52444: handler run complete 11683 1726853253.52447: attempt loop complete, returning result 11683 1726853253.52449: _execute() done 11683 1726853253.52451: dumping result to json 11683 1726853253.52453: done dumping result, returning 11683 1726853253.52455: done running TaskExecutor() for managed_node3/TASK: Install pgrep, sysctl [02083763-bbaf-c5b2-e075-000000000011] 11683 1726853253.52456: sending task result for task 02083763-bbaf-c5b2-e075-000000000011 ok: [managed_node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11683 1726853253.52644: no more pending results, returning what we have 11683 1726853253.52648: results queue empty 11683 1726853253.52649: checking for any_errors_fatal 11683 1726853253.52657: done checking for any_errors_fatal 11683 1726853253.52658: checking for max_fail_percentage 11683 1726853253.52659: done checking for max_fail_percentage 11683 1726853253.52660: checking to see if all hosts have failed and the running result is not ok 11683 1726853253.52662: done checking to see if all hosts have failed 11683 1726853253.52662: getting the remaining hosts for this loop 11683 1726853253.52664: done getting the remaining hosts for this loop 11683 1726853253.52668: getting the next task for host managed_node3 11683 1726853253.52677: done getting next task for host managed_node3 11683 1726853253.52679: ^ task is: TASK: Create test interfaces 11683 1726853253.52682: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11683 1726853253.52685: getting variables 11683 1726853253.52687: in VariableManager get_vars() 11683 1726853253.52727: Calling all_inventory to load vars for managed_node3 11683 1726853253.52730: Calling groups_inventory to load vars for managed_node3 11683 1726853253.52732: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853253.52744: Calling all_plugins_play to load vars for managed_node3 11683 1726853253.52747: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853253.52751: Calling groups_plugins_play to load vars for managed_node3 11683 1726853253.53212: done sending task result for task 02083763-bbaf-c5b2-e075-000000000011 11683 1726853253.53215: WORKER PROCESS EXITING 11683 1726853253.53227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853253.53433: done with get_vars() 11683 1726853253.53445: done getting variables 11683 1726853253.53607: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 13:27:33 -0400 (0:00:00.780) 0:00:06.608 ****** 11683 1726853253.53643: entering _queue_task() for managed_node3/shell 11683 1726853253.53645: Creating lock for shell 11683 1726853253.54206: worker is 1 (out of 1 available) 11683 1726853253.54216: exiting _queue_task() for managed_node3/shell 11683 1726853253.54225: done queuing things up, now waiting for results queue to drain 
11683 1726853253.54227: waiting for pending results... 11683 1726853253.54312: running TaskExecutor() for managed_node3/TASK: Create test interfaces 11683 1726853253.54434: in run() - task 02083763-bbaf-c5b2-e075-000000000012 11683 1726853253.54464: variable 'ansible_search_path' from source: unknown 11683 1726853253.54474: variable 'ansible_search_path' from source: unknown 11683 1726853253.54518: calling self._execute() 11683 1726853253.54606: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853253.54617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853253.54634: variable 'omit' from source: magic vars 11683 1726853253.55029: variable 'ansible_distribution_major_version' from source: facts 11683 1726853253.55047: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853253.55059: variable 'omit' from source: magic vars 11683 1726853253.55116: variable 'omit' from source: magic vars 11683 1726853253.55522: variable 'dhcp_interface1' from source: play vars 11683 1726853253.55541: variable 'dhcp_interface2' from source: play vars 11683 1726853253.55590: variable 'omit' from source: magic vars 11683 1726853253.55652: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853253.55783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853253.55882: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853253.55885: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853253.55887: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853253.55983: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853253.56015: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853253.56022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853253.56177: Set connection var ansible_shell_executable to /bin/sh 11683 1726853253.56410: Set connection var ansible_timeout to 10 11683 1726853253.56414: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853253.56416: Set connection var ansible_pipelining to False 11683 1726853253.56418: Set connection var ansible_shell_type to sh 11683 1726853253.56420: Set connection var ansible_connection to ssh 11683 1726853253.56422: variable 'ansible_shell_executable' from source: unknown 11683 1726853253.56424: variable 'ansible_connection' from source: unknown 11683 1726853253.56426: variable 'ansible_module_compression' from source: unknown 11683 1726853253.56428: variable 'ansible_shell_type' from source: unknown 11683 1726853253.56430: variable 'ansible_shell_executable' from source: unknown 11683 1726853253.56432: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853253.56434: variable 'ansible_pipelining' from source: unknown 11683 1726853253.56435: variable 'ansible_timeout' from source: unknown 11683 1726853253.56438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853253.56676: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853253.56848: variable 'omit' from source: magic vars 11683 1726853253.56858: starting attempt loop 11683 1726853253.56870: running the handler 11683 1726853253.56885: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853253.56906: _low_level_execute_command(): starting 11683 1726853253.56917: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853253.58435: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853253.58493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853253.58509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.58538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853253.58826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.60524: stdout chunk (state=3): >>>/root <<< 11683 1726853253.60803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853253.60806: stdout chunk (state=3): >>><<< 11683 1726853253.60808: stderr chunk 
(state=3): >>><<< 11683 1726853253.60812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853253.60814: _low_level_execute_command(): starting 11683 1726853253.60817: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986 `" && echo ansible-tmp-1726853253.6070204-11969-92013448364986="` echo /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986 `" ) && sleep 0' 11683 1726853253.61924: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853253.62028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853253.62032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.62065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853253.62488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.64327: stdout chunk (state=3): >>>ansible-tmp-1726853253.6070204-11969-92013448364986=/root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986 <<< 11683 1726853253.64479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853253.64482: stdout chunk (state=3): >>><<< 11683 1726853253.64493: stderr chunk (state=3): >>><<< 11683 1726853253.64578: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853253.6070204-11969-92013448364986=/root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853253.64583: variable 'ansible_module_compression' from source: unknown 11683 1726853253.64624: ANSIBALLZ: Using generic lock for ansible.legacy.command 11683 1726853253.64632: ANSIBALLZ: Acquiring lock 11683 1726853253.64640: ANSIBALLZ: Lock acquired: 139785061355968 11683 1726853253.64648: ANSIBALLZ: Creating module 11683 1726853253.83910: ANSIBALLZ: Writing module into payload 11683 1726853253.84010: ANSIBALLZ: Writing module 11683 1726853253.84043: ANSIBALLZ: Renaming module 11683 1726853253.84063: ANSIBALLZ: Done creating module 11683 1726853253.84137: variable 'ansible_facts' from source: unknown 11683 1726853253.84168: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py 11683 1726853253.84380: Sending initial data 11683 1726853253.84384: Sent initial data (155 bytes) 11683 1726853253.84960: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853253.84979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853253.84995: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853253.85102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853253.85128: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.85146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853253.85249: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.86901: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853253.86995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853253.87059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpqzjnbadp /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py <<< 11683 1726853253.87077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py" <<< 11683 1726853253.87149: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpqzjnbadp" to remote "/root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py" <<< 11683 1726853253.88187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853253.88216: stderr chunk (state=3): >>><<< 11683 1726853253.88227: stdout chunk (state=3): >>><<< 11683 1726853253.88341: done transferring module to remote 11683 1726853253.88344: _low_level_execute_command(): starting 11683 1726853253.88347: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/ /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py && sleep 0' 11683 1726853253.88901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853253.88989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853253.89030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853253.89046: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.89075: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853253.89162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853253.91027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853253.91031: stdout chunk (state=3): >>><<< 11683 1726853253.91039: stderr chunk (state=3): >>><<< 11683 1726853253.91150: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853253.91155: _low_level_execute_command(): starting 11683 1726853253.91168: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/AnsiballZ_command.py && sleep 0' 11683 1726853253.91800: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853253.91803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853253.91812: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11683 1726853253.91976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.30151: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep 
NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 11683 1726853255.30164: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:27:34.072813", "end": "2024-09-20 13:27:35.299763", "delta": "0:00:01.226950", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853255.31862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853255.31866: stderr chunk (state=3): >>><<< 11683 1726853255.31868: stdout chunk (state=3): >>><<< 11683 1726853255.31898: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 13:27:34.072813", "end": "2024-09-20 13:27:35.299763", "delta": "0:00:01.226950", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853255.31938: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853255.31948: _low_level_execute_command(): starting 11683 1726853255.31951: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853253.6070204-11969-92013448364986/ > /dev/null 2>&1 && sleep 0' 11683 1726853255.32408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.32411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.32414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration <<< 11683 1726853255.32416: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.32474: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.32477: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.32543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.34523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.34547: stderr chunk (state=3): >>><<< 11683 1726853255.34550: stdout chunk (state=3): >>><<< 11683 1726853255.34567: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.34778: handler run complete 11683 1726853255.34782: Evaluated conditional (False): False 11683 1726853255.34784: attempt loop complete, returning result 11683 1726853255.34786: _execute() done 11683 1726853255.34788: dumping result to json 11683 1726853255.34790: done dumping result, returning 11683 1726853255.34792: done running TaskExecutor() for managed_node3/TASK: Create test interfaces [02083763-bbaf-c5b2-e075-000000000012] 11683 1726853255.34794: sending task result for task 02083763-bbaf-c5b2-e075-000000000012 11683 1726853255.34868: done sending task result for task 02083763-bbaf-c5b2-e075-000000000012 11683 1726853255.34876: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.226950", "end": "2024-09-20 13:27:35.299763", "rc": 0, "start": "2024-09-20 13:27:34.072813" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11683 1726853255.34963: no more pending results, returning what we have 11683 1726853255.34967: results queue empty 11683 1726853255.34968: checking for any_errors_fatal 11683 1726853255.34977: done checking for any_errors_fatal 11683 1726853255.34978: checking for max_fail_percentage 11683 1726853255.34980: done checking for max_fail_percentage 11683 1726853255.34981: checking to see if all hosts have failed and 
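The script in the task result above contains a polling loop (the workaround for the NetworkManager bug tracked in rhbz#2079642): it repeatedly re-adds addresses to `testbr` until `ip addr show` reports an IPv4 address, giving up after 30 iterations. That retry pattern can be exercised in isolation; the sketch below is a minimal stand-in, where `check_ready` is a hypothetical stub replacing the real `ip addr show testbr | grep -q 'inet [1-9]'` check (it is not part of the original script).

```shell
#!/bin/sh
# Minimal sketch of the retry pattern used in the task above:
# poll until a condition holds, bail out after a fixed number of attempts.
# check_ready is a stub standing in for `ip addr show testbr | grep -q 'inet [1-9]'`.
attempts=0
check_ready() { [ "$attempts" -ge 3 ]; }   # pretend the address appears on the 3rd poll

while ! check_ready; do
    attempts=$((attempts + 1))
    if [ "$attempts" -eq 30 ]; then
        # mirrors the script's "ERROR - could not add testbr" path
        echo "ERROR - condition never became true" >&2
        exit 1
    fi
    # the real script sleeps 1s and retries `ip addr add ...` here
done
echo "ready after $attempts attempts"
```

The real loop additionally re-runs the `ip addr add` commands on every iteration and only reports a NOTICE (not a failure) when they error, which is what makes it tolerant of NM transiently flushing the bridge's addresses.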
the running result is not ok 11683 1726853255.34982: done checking to see if all hosts have failed 11683 1726853255.34983: getting the remaining hosts for this loop 11683 1726853255.34985: done getting the remaining hosts for this loop 11683 1726853255.34988: getting the next task for host managed_node3 11683 1726853255.34996: done getting next task for host managed_node3 11683 1726853255.35000: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11683 1726853255.35003: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853255.35006: getting variables 11683 1726853255.35008: in VariableManager get_vars() 11683 1726853255.35050: Calling all_inventory to load vars for managed_node3 11683 1726853255.35053: Calling groups_inventory to load vars for managed_node3 11683 1726853255.35056: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.35068: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.35274: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.35281: Calling groups_plugins_play to load vars for managed_node3 11683 1726853255.35453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.35761: done with get_vars() 11683 1726853255.35775: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:35 -0400 (0:00:01.822) 0:00:08.430 ****** 11683 1726853255.35868: entering _queue_task() for managed_node3/include_tasks 11683 1726853255.36211: worker is 1 (out of 1 available) 11683 1726853255.36227: exiting _queue_task() for managed_node3/include_tasks 11683 1726853255.36238: done queuing things up, now waiting for results queue to drain 11683 1726853255.36239: waiting for pending results... 
11683 1726853255.36455: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11683 1726853255.36568: in run() - task 02083763-bbaf-c5b2-e075-000000000016 11683 1726853255.36593: variable 'ansible_search_path' from source: unknown 11683 1726853255.36600: variable 'ansible_search_path' from source: unknown 11683 1726853255.36639: calling self._execute() 11683 1726853255.36723: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.36780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.36783: variable 'omit' from source: magic vars 11683 1726853255.37132: variable 'ansible_distribution_major_version' from source: facts 11683 1726853255.37149: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853255.37160: _execute() done 11683 1726853255.37167: dumping result to json 11683 1726853255.37177: done dumping result, returning 11683 1726853255.37188: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-c5b2-e075-000000000016] 11683 1726853255.37197: sending task result for task 02083763-bbaf-c5b2-e075-000000000016 11683 1726853255.37405: no more pending results, returning what we have 11683 1726853255.37411: in VariableManager get_vars() 11683 1726853255.37465: Calling all_inventory to load vars for managed_node3 11683 1726853255.37468: Calling groups_inventory to load vars for managed_node3 11683 1726853255.37472: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.37489: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.37492: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.37495: Calling groups_plugins_play to load vars for managed_node3 11683 1726853255.37919: done sending task result for task 02083763-bbaf-c5b2-e075-000000000016 11683 1726853255.37922: WORKER PROCESS EXITING 11683 
1726853255.37949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.38174: done with get_vars() 11683 1726853255.38182: variable 'ansible_search_path' from source: unknown 11683 1726853255.38183: variable 'ansible_search_path' from source: unknown 11683 1726853255.38219: we have included files to process 11683 1726853255.38220: generating all_blocks data 11683 1726853255.38223: done generating all_blocks data 11683 1726853255.38223: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853255.38224: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853255.38226: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853255.38443: done processing included file 11683 1726853255.38444: iterating over new_blocks loaded from include file 11683 1726853255.38446: in VariableManager get_vars() 11683 1726853255.38465: done with get_vars() 11683 1726853255.38467: filtering new block on tags 11683 1726853255.38483: done filtering new block on tags 11683 1726853255.38486: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11683 1726853255.38490: extending task lists for all hosts with included blocks 11683 1726853255.38584: done extending task lists 11683 1726853255.38586: done processing included files 11683 1726853255.38586: results queue empty 11683 1726853255.38587: checking for any_errors_fatal 11683 1726853255.38593: done checking for any_errors_fatal 11683 1726853255.38594: checking for max_fail_percentage 11683 1726853255.38595: done checking for 
max_fail_percentage 11683 1726853255.38595: checking to see if all hosts have failed and the running result is not ok 11683 1726853255.38596: done checking to see if all hosts have failed 11683 1726853255.38597: getting the remaining hosts for this loop 11683 1726853255.38598: done getting the remaining hosts for this loop 11683 1726853255.38600: getting the next task for host managed_node3 11683 1726853255.38605: done getting next task for host managed_node3 11683 1726853255.38606: ^ task is: TASK: Get stat for interface {{ interface }} 11683 1726853255.38609: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853255.38611: getting variables 11683 1726853255.38612: in VariableManager get_vars() 11683 1726853255.38625: Calling all_inventory to load vars for managed_node3 11683 1726853255.38627: Calling groups_inventory to load vars for managed_node3 11683 1726853255.38629: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.38634: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.38636: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.38638: Calling groups_plugins_play to load vars for managed_node3 11683 1726853255.38791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.38969: done with get_vars() 11683 1726853255.38980: done getting variables 11683 1726853255.39148: variable 'interface' from source: task vars 11683 1726853255.39153: variable 'dhcp_interface1' from source: play vars 11683 1726853255.39216: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:35 -0400 (0:00:00.033) 0:00:08.464 ****** 11683 1726853255.39255: entering _queue_task() for managed_node3/stat 11683 1726853255.39544: worker is 1 (out of 1 available) 11683 1726853255.39556: exiting _queue_task() for managed_node3/stat 11683 1726853255.39567: done queuing things up, now waiting for results queue to drain 11683 1726853255.39569: waiting for pending results... 
11683 1726853255.39818: running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 11683 1726853255.39937: in run() - task 02083763-bbaf-c5b2-e075-000000000152 11683 1726853255.39958: variable 'ansible_search_path' from source: unknown 11683 1726853255.39965: variable 'ansible_search_path' from source: unknown 11683 1726853255.40008: calling self._execute() 11683 1726853255.40087: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.40098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.40114: variable 'omit' from source: magic vars 11683 1726853255.40457: variable 'ansible_distribution_major_version' from source: facts 11683 1726853255.40476: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853255.40487: variable 'omit' from source: magic vars 11683 1726853255.40549: variable 'omit' from source: magic vars 11683 1726853255.40648: variable 'interface' from source: task vars 11683 1726853255.40662: variable 'dhcp_interface1' from source: play vars 11683 1726853255.40729: variable 'dhcp_interface1' from source: play vars 11683 1726853255.40753: variable 'omit' from source: magic vars 11683 1726853255.40802: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853255.40844: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853255.40873: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853255.40897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853255.40917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853255.40952: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853255.40962: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.40969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.41083: Set connection var ansible_shell_executable to /bin/sh 11683 1726853255.41100: Set connection var ansible_timeout to 10 11683 1726853255.41112: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853255.41121: Set connection var ansible_pipelining to False 11683 1726853255.41134: Set connection var ansible_shell_type to sh 11683 1726853255.41137: Set connection var ansible_connection to ssh 11683 1726853255.41276: variable 'ansible_shell_executable' from source: unknown 11683 1726853255.41278: variable 'ansible_connection' from source: unknown 11683 1726853255.41281: variable 'ansible_module_compression' from source: unknown 11683 1726853255.41283: variable 'ansible_shell_type' from source: unknown 11683 1726853255.41285: variable 'ansible_shell_executable' from source: unknown 11683 1726853255.41287: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.41288: variable 'ansible_pipelining' from source: unknown 11683 1726853255.41290: variable 'ansible_timeout' from source: unknown 11683 1726853255.41292: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.41403: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853255.41476: variable 'omit' from source: magic vars 11683 1726853255.41480: starting attempt loop 11683 1726853255.41482: running the handler 11683 1726853255.41484: _low_level_execute_command(): starting 11683 1726853255.41486: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 
1726853255.42234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853255.42295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.42362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.42395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.42410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.42515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.44262: stdout chunk (state=3): >>>/root <<< 11683 1726853255.44414: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.44418: stdout chunk (state=3): >>><<< 11683 1726853255.44420: stderr chunk (state=3): >>><<< 11683 1726853255.44445: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.44545: _low_level_execute_command(): starting 11683 1726853255.44551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241 `" && echo ansible-tmp-1726853255.4445243-12060-218476706574241="` echo /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241 `" ) && sleep 0' 11683 1726853255.45077: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853255.45088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.45099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.45119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853255.45130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 <<< 11683 1726853255.45133: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853255.45278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.45282: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.45284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.45337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.47270: stdout chunk (state=3): >>>ansible-tmp-1726853255.4445243-12060-218476706574241=/root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241 <<< 11683 1726853255.47387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.47458: stderr chunk (state=3): >>><<< 11683 1726853255.47498: stdout chunk (state=3): >>><<< 11683 1726853255.47511: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853255.4445243-12060-218476706574241=/root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.47699: variable 'ansible_module_compression' from source: unknown 11683 1726853255.47702: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11683 1726853255.47704: variable 'ansible_facts' from source: unknown 11683 1726853255.47832: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py 11683 1726853255.48120: Sending initial data 11683 1726853255.48124: Sent initial data (153 bytes) 11683 1726853255.48637: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853255.48753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.48794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.48899: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.50534: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853255.50604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853255.50691: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpgl282f70 /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py <<< 11683 1726853255.50701: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py" <<< 11683 1726853255.50753: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpgl282f70" to remote "/root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py" <<< 11683 1726853255.51608: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.51646: stderr chunk (state=3): >>><<< 11683 1726853255.51650: stdout chunk (state=3): >>><<< 11683 1726853255.51692: done transferring module to remote 11683 1726853255.51708: _low_level_execute_command(): starting 11683 1726853255.51712: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/ /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py && sleep 0' 11683 1726853255.52131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.52140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853255.52168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 
1726853255.52177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.52179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853255.52185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.52233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.52236: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.52240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.52298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.54182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.54186: stdout chunk (state=3): >>><<< 11683 1726853255.54188: stderr chunk (state=3): >>><<< 11683 1726853255.54282: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.54285: _low_level_execute_command(): starting 11683 1726853255.54288: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/AnsiballZ_stat.py && sleep 0' 11683 1726853255.54786: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.54792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853255.54859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.54864: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.54866: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.54887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.54966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.70641: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27428, "dev": 23, "nlink": 1, "atime": 1726853254.079571, "mtime": 1726853254.079571, "ctime": 1726853254.079571, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11683 1726853255.72011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853255.72040: stderr chunk (state=3): >>><<< 11683 1726853255.72043: stdout chunk (state=3): >>><<< 11683 1726853255.72061: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27428, "dev": 23, "nlink": 1, "atime": 1726853254.079571, "mtime": 1726853254.079571, "ctime": 1726853254.079571, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853255.72100: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853255.72108: _low_level_execute_command(): starting 11683 1726853255.72117: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853255.4445243-12060-218476706574241/ > /dev/null 2>&1 && sleep 0' 11683 1726853255.72552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.72585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.72589: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.72591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.72642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.72651: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.72653: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.72710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.74576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.74605: stderr chunk (state=3): >>><<< 11683 1726853255.74608: stdout chunk (state=3): >>><<< 11683 1726853255.74622: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.74628: handler run complete 11683 1726853255.74658: attempt loop complete, returning result 11683 1726853255.74661: _execute() done 11683 1726853255.74663: dumping result to json 11683 1726853255.74669: done dumping result, returning 11683 1726853255.74677: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test1 [02083763-bbaf-c5b2-e075-000000000152] 11683 1726853255.74681: sending task result for task 02083763-bbaf-c5b2-e075-000000000152 11683 1726853255.74786: done sending task result for task 02083763-bbaf-c5b2-e075-000000000152 11683 1726853255.74788: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853254.079571, "block_size": 4096, "blocks": 0, "ctime": 1726853254.079571, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27428, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726853254.079571, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11683 1726853255.74894: no more pending results, returning what we have 11683 1726853255.74898: results 
queue empty 11683 1726853255.74899: checking for any_errors_fatal 11683 1726853255.74900: done checking for any_errors_fatal 11683 1726853255.74900: checking for max_fail_percentage 11683 1726853255.74902: done checking for max_fail_percentage 11683 1726853255.74903: checking to see if all hosts have failed and the running result is not ok 11683 1726853255.74904: done checking to see if all hosts have failed 11683 1726853255.74904: getting the remaining hosts for this loop 11683 1726853255.74906: done getting the remaining hosts for this loop 11683 1726853255.74909: getting the next task for host managed_node3 11683 1726853255.74917: done getting next task for host managed_node3 11683 1726853255.74919: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11683 1726853255.74922: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853255.74925: getting variables 11683 1726853255.74927: in VariableManager get_vars() 11683 1726853255.74962: Calling all_inventory to load vars for managed_node3 11683 1726853255.74964: Calling groups_inventory to load vars for managed_node3 11683 1726853255.74966: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.74983: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.74985: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.74988: Calling groups_plugins_play to load vars for managed_node3 11683 1726853255.75107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.75226: done with get_vars() 11683 1726853255.75234: done getting variables 11683 1726853255.75307: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11683 1726853255.75394: variable 'interface' from source: task vars 11683 1726853255.75397: variable 'dhcp_interface1' from source: play vars 11683 1726853255.75440: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:35 -0400 (0:00:00.362) 0:00:08.826 ****** 11683 1726853255.75469: entering _queue_task() for managed_node3/assert 11683 1726853255.75470: Creating lock for assert 11683 1726853255.75677: worker is 1 (out of 1 available) 11683 1726853255.75690: exiting _queue_task() for managed_node3/assert 11683 1726853255.75700: done queuing things up, now waiting for results queue to drain 11683 
1726853255.75701: waiting for pending results... 11683 1726853255.75853: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' 11683 1726853255.75923: in run() - task 02083763-bbaf-c5b2-e075-000000000017 11683 1726853255.75936: variable 'ansible_search_path' from source: unknown 11683 1726853255.75940: variable 'ansible_search_path' from source: unknown 11683 1726853255.75964: calling self._execute() 11683 1726853255.76021: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.76024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.76036: variable 'omit' from source: magic vars 11683 1726853255.76329: variable 'ansible_distribution_major_version' from source: facts 11683 1726853255.76338: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853255.76346: variable 'omit' from source: magic vars 11683 1726853255.76378: variable 'omit' from source: magic vars 11683 1726853255.76439: variable 'interface' from source: task vars 11683 1726853255.76442: variable 'dhcp_interface1' from source: play vars 11683 1726853255.76489: variable 'dhcp_interface1' from source: play vars 11683 1726853255.76503: variable 'omit' from source: magic vars 11683 1726853255.76533: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853255.76559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853255.76575: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853255.76593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853255.76601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853255.76625: 
variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853255.76628: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.76630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.76699: Set connection var ansible_shell_executable to /bin/sh 11683 1726853255.76707: Set connection var ansible_timeout to 10 11683 1726853255.76713: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853255.76718: Set connection var ansible_pipelining to False 11683 1726853255.76720: Set connection var ansible_shell_type to sh 11683 1726853255.76723: Set connection var ansible_connection to ssh 11683 1726853255.76738: variable 'ansible_shell_executable' from source: unknown 11683 1726853255.76741: variable 'ansible_connection' from source: unknown 11683 1726853255.76743: variable 'ansible_module_compression' from source: unknown 11683 1726853255.76748: variable 'ansible_shell_type' from source: unknown 11683 1726853255.76750: variable 'ansible_shell_executable' from source: unknown 11683 1726853255.76752: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.76755: variable 'ansible_pipelining' from source: unknown 11683 1726853255.76757: variable 'ansible_timeout' from source: unknown 11683 1726853255.76759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.76882: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853255.76891: variable 'omit' from source: magic vars 11683 1726853255.76896: starting attempt loop 11683 1726853255.76899: running the handler 11683 1726853255.76990: variable 'interface_stat' from source: set_fact 11683 
1726853255.77002: Evaluated conditional (interface_stat.stat.exists): True 11683 1726853255.77007: handler run complete 11683 1726853255.77019: attempt loop complete, returning result 11683 1726853255.77022: _execute() done 11683 1726853255.77025: dumping result to json 11683 1726853255.77027: done dumping result, returning 11683 1726853255.77035: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test1' [02083763-bbaf-c5b2-e075-000000000017] 11683 1726853255.77038: sending task result for task 02083763-bbaf-c5b2-e075-000000000017 11683 1726853255.77114: done sending task result for task 02083763-bbaf-c5b2-e075-000000000017 11683 1726853255.77116: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853255.77187: no more pending results, returning what we have 11683 1726853255.77190: results queue empty 11683 1726853255.77192: checking for any_errors_fatal 11683 1726853255.77199: done checking for any_errors_fatal 11683 1726853255.77200: checking for max_fail_percentage 11683 1726853255.77201: done checking for max_fail_percentage 11683 1726853255.77202: checking to see if all hosts have failed and the running result is not ok 11683 1726853255.77203: done checking to see if all hosts have failed 11683 1726853255.77204: getting the remaining hosts for this loop 11683 1726853255.77205: done getting the remaining hosts for this loop 11683 1726853255.77208: getting the next task for host managed_node3 11683 1726853255.77216: done getting next task for host managed_node3 11683 1726853255.77218: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11683 1726853255.77220: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853255.77223: getting variables 11683 1726853255.77224: in VariableManager get_vars() 11683 1726853255.77258: Calling all_inventory to load vars for managed_node3 11683 1726853255.77260: Calling groups_inventory to load vars for managed_node3 11683 1726853255.77262: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.77273: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.77275: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.77278: Calling groups_plugins_play to load vars for managed_node3 11683 1726853255.77417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.77533: done with get_vars() 11683 1726853255.77540: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:35 -0400 (0:00:00.021) 0:00:08.848 ****** 11683 1726853255.77604: entering _queue_task() for managed_node3/include_tasks 11683 1726853255.77794: worker is 1 (out of 1 available) 11683 1726853255.77808: exiting _queue_task() for managed_node3/include_tasks 11683 1726853255.77818: done queuing things up, now waiting for results queue to drain 11683 1726853255.77820: waiting for pending results... 
11683 1726853255.77969: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11683 1726853255.78035: in run() - task 02083763-bbaf-c5b2-e075-00000000001b 11683 1726853255.78049: variable 'ansible_search_path' from source: unknown 11683 1726853255.78054: variable 'ansible_search_path' from source: unknown 11683 1726853255.78080: calling self._execute() 11683 1726853255.78137: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.78140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.78151: variable 'omit' from source: magic vars 11683 1726853255.78611: variable 'ansible_distribution_major_version' from source: facts 11683 1726853255.78615: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853255.78618: _execute() done 11683 1726853255.78620: dumping result to json 11683 1726853255.78623: done dumping result, returning 11683 1726853255.78626: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-c5b2-e075-00000000001b] 11683 1726853255.78629: sending task result for task 02083763-bbaf-c5b2-e075-00000000001b 11683 1726853255.78702: done sending task result for task 02083763-bbaf-c5b2-e075-00000000001b 11683 1726853255.78705: WORKER PROCESS EXITING 11683 1726853255.78734: no more pending results, returning what we have 11683 1726853255.78739: in VariableManager get_vars() 11683 1726853255.78794: Calling all_inventory to load vars for managed_node3 11683 1726853255.78797: Calling groups_inventory to load vars for managed_node3 11683 1726853255.78799: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.78813: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.78815: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.78818: Calling groups_plugins_play to load vars for managed_node3 11683 
1726853255.79214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.79602: done with get_vars() 11683 1726853255.79607: variable 'ansible_search_path' from source: unknown 11683 1726853255.79608: variable 'ansible_search_path' from source: unknown 11683 1726853255.79630: we have included files to process 11683 1726853255.79630: generating all_blocks data 11683 1726853255.79631: done generating all_blocks data 11683 1726853255.79633: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853255.79633: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853255.79635: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853255.79755: done processing included file 11683 1726853255.79757: iterating over new_blocks loaded from include file 11683 1726853255.79758: in VariableManager get_vars() 11683 1726853255.79770: done with get_vars() 11683 1726853255.79773: filtering new block on tags 11683 1726853255.79784: done filtering new block on tags 11683 1726853255.79785: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11683 1726853255.79788: extending task lists for all hosts with included blocks 11683 1726853255.79843: done extending task lists 11683 1726853255.79844: done processing included files 11683 1726853255.79845: results queue empty 11683 1726853255.79846: checking for any_errors_fatal 11683 1726853255.79848: done checking for any_errors_fatal 11683 1726853255.79848: checking for max_fail_percentage 11683 1726853255.79849: done checking for 
max_fail_percentage 11683 1726853255.79849: checking to see if all hosts have failed and the running result is not ok 11683 1726853255.79850: done checking to see if all hosts have failed 11683 1726853255.79850: getting the remaining hosts for this loop 11683 1726853255.79851: done getting the remaining hosts for this loop 11683 1726853255.79853: getting the next task for host managed_node3 11683 1726853255.79855: done getting next task for host managed_node3 11683 1726853255.79857: ^ task is: TASK: Get stat for interface {{ interface }} 11683 1726853255.79860: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853255.79861: getting variables 11683 1726853255.79862: in VariableManager get_vars() 11683 1726853255.79873: Calling all_inventory to load vars for managed_node3 11683 1726853255.79874: Calling groups_inventory to load vars for managed_node3 11683 1726853255.79875: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853255.79879: Calling all_plugins_play to load vars for managed_node3 11683 1726853255.79880: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853255.79882: Calling groups_plugins_play to load vars for managed_node3 11683 1726853255.79959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853255.80072: done with get_vars() 11683 1726853255.80079: done getting variables 11683 1726853255.80184: variable 'interface' from source: task vars 11683 1726853255.80187: variable 'dhcp_interface2' from source: play vars 11683 1726853255.80228: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:35 -0400 (0:00:00.026) 0:00:08.874 ****** 11683 1726853255.80249: entering _queue_task() for managed_node3/stat 11683 1726853255.80450: worker is 1 (out of 1 available) 11683 1726853255.80464: exiting _queue_task() for managed_node3/stat 11683 1726853255.80476: done queuing things up, now waiting for results queue to drain 11683 1726853255.80478: waiting for pending results... 
11683 1726853255.80635: running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 11683 1726853255.80719: in run() - task 02083763-bbaf-c5b2-e075-00000000016a 11683 1726853255.80730: variable 'ansible_search_path' from source: unknown 11683 1726853255.80734: variable 'ansible_search_path' from source: unknown 11683 1726853255.80763: calling self._execute() 11683 1726853255.80825: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.80829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.80839: variable 'omit' from source: magic vars 11683 1726853255.81089: variable 'ansible_distribution_major_version' from source: facts 11683 1726853255.81098: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853255.81103: variable 'omit' from source: magic vars 11683 1726853255.81140: variable 'omit' from source: magic vars 11683 1726853255.81376: variable 'interface' from source: task vars 11683 1726853255.81379: variable 'dhcp_interface2' from source: play vars 11683 1726853255.81382: variable 'dhcp_interface2' from source: play vars 11683 1726853255.81384: variable 'omit' from source: magic vars 11683 1726853255.81385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853255.81509: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853255.81535: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853255.81561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853255.81580: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853255.81616: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853255.81624: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.81632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.81728: Set connection var ansible_shell_executable to /bin/sh 11683 1726853255.81741: Set connection var ansible_timeout to 10 11683 1726853255.81753: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853255.81761: Set connection var ansible_pipelining to False 11683 1726853255.81766: Set connection var ansible_shell_type to sh 11683 1726853255.81773: Set connection var ansible_connection to ssh 11683 1726853255.81864: variable 'ansible_shell_executable' from source: unknown 11683 1726853255.81877: variable 'ansible_connection' from source: unknown 11683 1726853255.81885: variable 'ansible_module_compression' from source: unknown 11683 1726853255.81892: variable 'ansible_shell_type' from source: unknown 11683 1726853255.81899: variable 'ansible_shell_executable' from source: unknown 11683 1726853255.81905: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853255.81913: variable 'ansible_pipelining' from source: unknown 11683 1726853255.81921: variable 'ansible_timeout' from source: unknown 11683 1726853255.81929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853255.82110: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853255.82124: variable 'omit' from source: magic vars 11683 1726853255.82175: starting attempt loop 11683 1726853255.82178: running the handler 11683 1726853255.82181: _low_level_execute_command(): starting 11683 1726853255.82182: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 
1726853255.82880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853255.82896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.82908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.82929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853255.82950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853255.82962: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853255.82978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.82995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853255.83007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853255.83097: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.83124: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.83217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.85015: stdout chunk (state=3): >>>/root <<< 11683 1726853255.85074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.85139: stderr chunk (state=3): >>><<< 
11683 1726853255.85153: stdout chunk (state=3): >>><<< 11683 1726853255.85185: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.85291: _low_level_execute_command(): starting 11683 1726853255.85295: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009 `" && echo ansible-tmp-1726853255.8519354-12079-236654441836009="` echo /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009 `" ) && sleep 0' 11683 1726853255.85887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853255.85910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.85970: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.86036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.86059: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.86101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.86383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.88358: stdout chunk (state=3): >>>ansible-tmp-1726853255.8519354-12079-236654441836009=/root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009 <<< 11683 1726853255.88775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.88836: stderr chunk (state=3): >>><<< 11683 1726853255.88848: stdout chunk (state=3): >>><<< 11683 1726853255.88878: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853255.8519354-12079-236654441836009=/root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.88930: variable 'ansible_module_compression' from source: unknown 11683 1726853255.88993: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11683 1726853255.89039: variable 'ansible_facts' from source: unknown 11683 1726853255.89140: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py 11683 1726853255.89295: Sending initial data 11683 1726853255.89306: Sent initial data (153 bytes) 11683 1726853255.90292: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.90346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.92013: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853255.92084: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853255.92169: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpj5ews5di /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py <<< 11683 1726853255.92174: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py" <<< 11683 1726853255.92216: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11683 1726853255.92247: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpj5ews5di" to remote "/root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py" <<< 11683 1726853255.93429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.93574: stderr chunk (state=3): >>><<< 11683 1726853255.93578: stdout chunk (state=3): >>><<< 11683 1726853255.93581: done transferring module to remote 11683 1726853255.93583: _low_level_execute_command(): starting 11683 1726853255.93585: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/ /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py && sleep 0' 11683 1726853255.94120: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.94189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.94245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853255.94261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.94287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.94377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853255.96341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853255.96346: stdout chunk (state=3): >>><<< 11683 1726853255.96474: stderr chunk (state=3): >>><<< 11683 1726853255.96482: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853255.96485: _low_level_execute_command(): starting 11683 1726853255.96488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/AnsiballZ_stat.py && sleep 0' 11683 1726853255.97367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853255.97386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853255.97400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853255.97415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853255.97433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853255.97477: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853255.97551: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' <<< 11683 1726853255.97586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853255.97599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853255.97710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853256.13308: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27834, "dev": 23, "nlink": 1, "atime": 1726853254.082611, "mtime": 1726853254.082611, "ctime": 1726853254.082611, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11683 1726853256.14786: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853256.14790: stdout chunk (state=3): >>><<< 11683 1726853256.14793: stderr chunk (state=3): >>><<< 11683 1726853256.14795: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27834, "dev": 23, "nlink": 1, "atime": 1726853254.082611, "mtime": 1726853254.082611, "ctime": 1726853254.082611, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853256.14889: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853256.14985: _low_level_execute_command(): starting 11683 1726853256.15277: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853255.8519354-12079-236654441836009/ > /dev/null 2>&1 && sleep 0' 11683 1726853256.15848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853256.15862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853256.15906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass <<< 11683 1726853256.15921: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853256.16018: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853256.16042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853256.16148: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853256.18117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853256.18140: stderr chunk (state=3): >>><<< 11683 1726853256.18153: stdout chunk (state=3): >>><<< 11683 1726853256.18377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853256.18380: handler run complete 11683 1726853256.18382: attempt loop complete, returning result 11683 1726853256.18383: _execute() done 11683 1726853256.18385: dumping result to json 11683 1726853256.18387: done dumping result, returning 11683 1726853256.18388: done running TaskExecutor() for managed_node3/TASK: Get stat for interface test2 [02083763-bbaf-c5b2-e075-00000000016a] 11683 1726853256.18390: sending task result for task 02083763-bbaf-c5b2-e075-00000000016a 11683 1726853256.18459: done sending task result for task 02083763-bbaf-c5b2-e075-00000000016a 11683 1726853256.18462: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853254.082611, "block_size": 4096, "blocks": 0, "ctime": 1726853254.082611, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27834, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726853254.082611, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11683 1726853256.18556: no more pending results, returning what we have 11683 1726853256.18560: results queue empty 11683 1726853256.18560: checking for any_errors_fatal 11683 
1726853256.18562: done checking for any_errors_fatal 11683 1726853256.18562: checking for max_fail_percentage 11683 1726853256.18564: done checking for max_fail_percentage 11683 1726853256.18565: checking to see if all hosts have failed and the running result is not ok 11683 1726853256.18566: done checking to see if all hosts have failed 11683 1726853256.18566: getting the remaining hosts for this loop 11683 1726853256.18568: done getting the remaining hosts for this loop 11683 1726853256.18574: getting the next task for host managed_node3 11683 1726853256.18581: done getting next task for host managed_node3 11683 1726853256.18583: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11683 1726853256.18586: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853256.18590: getting variables 11683 1726853256.18592: in VariableManager get_vars() 11683 1726853256.18632: Calling all_inventory to load vars for managed_node3 11683 1726853256.18635: Calling groups_inventory to load vars for managed_node3 11683 1726853256.18728: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.18740: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.18743: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.18942: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.19166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.19362: done with get_vars() 11683 1726853256.19375: done getting variables 11683 1726853256.19432: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853256.19548: variable 'interface' from source: task vars 11683 1726853256.19552: variable 'dhcp_interface2' from source: play vars 11683 1726853256.19616: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:36 -0400 (0:00:00.393) 0:00:09.268 ****** 11683 1726853256.19653: entering _queue_task() for managed_node3/assert 11683 1726853256.19930: worker is 1 (out of 1 available) 11683 1726853256.19943: exiting _queue_task() for managed_node3/assert 11683 1726853256.19953: done queuing things up, now waiting for results queue to drain 11683 1726853256.19955: waiting for pending results... 
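The stat result above and the assert task queued here follow the stat-then-assert pattern of the test suite's `assert_device_present.yml`. That file's contents are not reproduced in this log, so the sketch below is a reconstruction: the task names, the `interface_stat` register, and the `interface_stat.stat.exists` condition are taken from the log output, while the exact YAML layout is assumed.

```yaml
# Illustrative reconstruction of the stat-then-assert pattern seen in this log.
# The real tasks live in tests/network/playbooks/tasks/assert_device_present.yml.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # a symlink for devices that exist
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```

Note that `/sys/class/net/test2` is reported in the stat result as a symlink (`islnk: true`) pointing at `../../devices/virtual/net/test2`, which is why `interface_stat.stat.exists` is sufficient to prove the device is present.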
11683 1726853256.20299: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' 11683 1726853256.20380: in run() - task 02083763-bbaf-c5b2-e075-00000000001c 11683 1726853256.20383: variable 'ansible_search_path' from source: unknown 11683 1726853256.20386: variable 'ansible_search_path' from source: unknown 11683 1726853256.20402: calling self._execute() 11683 1726853256.20486: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.20498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.20514: variable 'omit' from source: magic vars 11683 1726853256.20937: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.20941: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.20943: variable 'omit' from source: magic vars 11683 1726853256.20945: variable 'omit' from source: magic vars 11683 1726853256.21035: variable 'interface' from source: task vars 11683 1726853256.21044: variable 'dhcp_interface2' from source: play vars 11683 1726853256.21117: variable 'dhcp_interface2' from source: play vars 11683 1726853256.21141: variable 'omit' from source: magic vars 11683 1726853256.21192: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853256.21230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853256.21254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853256.21280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853256.21295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853256.21327: variable 'inventory_hostname' from source: host 
vars for 'managed_node3' 11683 1726853256.21335: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.21384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.21449: Set connection var ansible_shell_executable to /bin/sh 11683 1726853256.21465: Set connection var ansible_timeout to 10 11683 1726853256.21478: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853256.21488: Set connection var ansible_pipelining to False 11683 1726853256.21498: Set connection var ansible_shell_type to sh 11683 1726853256.21504: Set connection var ansible_connection to ssh 11683 1726853256.21527: variable 'ansible_shell_executable' from source: unknown 11683 1726853256.21534: variable 'ansible_connection' from source: unknown 11683 1726853256.21775: variable 'ansible_module_compression' from source: unknown 11683 1726853256.21778: variable 'ansible_shell_type' from source: unknown 11683 1726853256.21780: variable 'ansible_shell_executable' from source: unknown 11683 1726853256.21782: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.21785: variable 'ansible_pipelining' from source: unknown 11683 1726853256.21787: variable 'ansible_timeout' from source: unknown 11683 1726853256.21789: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.21791: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853256.21794: variable 'omit' from source: magic vars 11683 1726853256.21796: starting attempt loop 11683 1726853256.21798: running the handler 11683 1726853256.21855: variable 'interface_stat' from source: set_fact 11683 1726853256.21882: Evaluated conditional 
(interface_stat.stat.exists): True 11683 1726853256.21894: handler run complete 11683 1726853256.21917: attempt loop complete, returning result 11683 1726853256.21925: _execute() done 11683 1726853256.21932: dumping result to json 11683 1726853256.21939: done dumping result, returning 11683 1726853256.21950: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'test2' [02083763-bbaf-c5b2-e075-00000000001c] 11683 1726853256.21959: sending task result for task 02083763-bbaf-c5b2-e075-00000000001c ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853256.22104: no more pending results, returning what we have 11683 1726853256.22108: results queue empty 11683 1726853256.22109: checking for any_errors_fatal 11683 1726853256.22118: done checking for any_errors_fatal 11683 1726853256.22119: checking for max_fail_percentage 11683 1726853256.22121: done checking for max_fail_percentage 11683 1726853256.22122: checking to see if all hosts have failed and the running result is not ok 11683 1726853256.22123: done checking to see if all hosts have failed 11683 1726853256.22124: getting the remaining hosts for this loop 11683 1726853256.22126: done getting the remaining hosts for this loop 11683 1726853256.22129: getting the next task for host managed_node3 11683 1726853256.22136: done getting next task for host managed_node3 11683 1726853256.22139: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11683 1726853256.22141: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853256.22145: getting variables 11683 1726853256.22147: in VariableManager get_vars() 11683 1726853256.22200: Calling all_inventory to load vars for managed_node3 11683 1726853256.22203: Calling groups_inventory to load vars for managed_node3 11683 1726853256.22205: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.22217: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.22220: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.22224: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.22674: done sending task result for task 02083763-bbaf-c5b2-e075-00000000001c 11683 1726853256.22678: WORKER PROCESS EXITING 11683 1726853256.22701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.22897: done with get_vars() 11683 1726853256.22908: done getting variables 11683 1726853256.22965: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Friday 20 September 2024 13:27:36 -0400 (0:00:00.033) 0:00:09.301 ****** 11683 1726853256.22993: entering _queue_task() for managed_node3/command 11683 1726853256.23244: worker is 1 (out of 1 available) 11683 1726853256.23255: exiting _queue_task() for managed_node3/command 11683 1726853256.23267: done queuing things up, now waiting for results queue to drain 11683 1726853256.23268: waiting for pending results... 
11683 1726853256.23520: running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript 11683 1726853256.23610: in run() - task 02083763-bbaf-c5b2-e075-00000000001d 11683 1726853256.23628: variable 'ansible_search_path' from source: unknown 11683 1726853256.23664: calling self._execute() 11683 1726853256.23752: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.23763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.23780: variable 'omit' from source: magic vars 11683 1726853256.24200: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.24215: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.24332: variable 'network_provider' from source: set_fact 11683 1726853256.24345: Evaluated conditional (network_provider == "initscripts"): False 11683 1726853256.24353: when evaluation is False, skipping this task 11683 1726853256.24360: _execute() done 11683 1726853256.24366: dumping result to json 11683 1726853256.24374: done dumping result, returning 11683 1726853256.24388: done running TaskExecutor() for managed_node3/TASK: Backup the /etc/resolv.conf for initscript [02083763-bbaf-c5b2-e075-00000000001d] 11683 1726853256.24395: sending task result for task 02083763-bbaf-c5b2-e075-00000000001d skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11683 1726853256.24532: no more pending results, returning what we have 11683 1726853256.24536: results queue empty 11683 1726853256.24537: checking for any_errors_fatal 11683 1726853256.24542: done checking for any_errors_fatal 11683 1726853256.24543: checking for max_fail_percentage 11683 1726853256.24544: done checking for max_fail_percentage 11683 1726853256.24545: checking to see if all hosts have failed and the running result is not ok 11683 
1726853256.24547: done checking to see if all hosts have failed 11683 1726853256.24547: getting the remaining hosts for this loop 11683 1726853256.24549: done getting the remaining hosts for this loop 11683 1726853256.24552: getting the next task for host managed_node3 11683 1726853256.24558: done getting next task for host managed_node3 11683 1726853256.24560: ^ task is: TASK: TEST Add Bond with 2 ports 11683 1726853256.24563: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853256.24566: getting variables 11683 1726853256.24568: in VariableManager get_vars() 11683 1726853256.24609: Calling all_inventory to load vars for managed_node3 11683 1726853256.24612: Calling groups_inventory to load vars for managed_node3 11683 1726853256.24614: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.24628: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.24631: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.24633: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.25048: done sending task result for task 02083763-bbaf-c5b2-e075-00000000001d 11683 1726853256.25052: WORKER PROCESS EXITING 11683 1726853256.25074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.25267: done with get_vars() 11683 1726853256.25277: done getting variables 11683 1726853256.25328: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Friday 20 September 2024 13:27:36 -0400 (0:00:00.023) 0:00:09.325 ****** 11683 1726853256.25352: entering _queue_task() for managed_node3/debug 11683 1726853256.25566: worker is 1 (out of 1 available) 11683 1726853256.25579: exiting _queue_task() for managed_node3/debug 11683 1726853256.25589: done queuing things up, now waiting for results queue to drain 11683 1726853256.25590: waiting for pending results... 11683 1726853256.25829: running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports 11683 1726853256.25980: in run() - task 02083763-bbaf-c5b2-e075-00000000001e 11683 1726853256.25983: variable 'ansible_search_path' from source: unknown 11683 1726853256.25986: calling self._execute() 11683 1726853256.26049: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.26059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.26075: variable 'omit' from source: magic vars 11683 1726853256.26425: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.26439: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.26447: variable 'omit' from source: magic vars 11683 1726853256.26466: variable 'omit' from source: magic vars 11683 1726853256.26501: variable 'omit' from source: magic vars 11683 1726853256.26543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853256.26581: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853256.26603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853256.26624: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853256.26643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853256.26976: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853256.26980: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.26982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.26984: Set connection var ansible_shell_executable to /bin/sh 11683 1726853256.26987: Set connection var ansible_timeout to 10 11683 1726853256.26989: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853256.26992: Set connection var ansible_pipelining to False 11683 1726853256.26994: Set connection var ansible_shell_type to sh 11683 1726853256.26996: Set connection var ansible_connection to ssh 11683 1726853256.26998: variable 'ansible_shell_executable' from source: unknown 11683 1726853256.27000: variable 'ansible_connection' from source: unknown 11683 1726853256.27002: variable 'ansible_module_compression' from source: unknown 11683 1726853256.27004: variable 'ansible_shell_type' from source: unknown 11683 1726853256.27007: variable 'ansible_shell_executable' from source: unknown 11683 1726853256.27009: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.27011: variable 'ansible_pipelining' from source: unknown 11683 1726853256.27013: variable 'ansible_timeout' from source: unknown 11683 1726853256.27016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.27035: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853256.27074: variable 'omit' from source: magic vars 11683 1726853256.27088: starting attempt loop 11683 1726853256.27109: running the handler 11683 1726853256.27162: handler run complete 11683 1726853256.27211: attempt loop complete, returning result 11683 1726853256.27219: _execute() done 11683 1726853256.27234: dumping result to json 11683 1726853256.27246: done dumping result, returning 11683 1726853256.27259: done running TaskExecutor() for managed_node3/TASK: TEST Add Bond with 2 ports [02083763-bbaf-c5b2-e075-00000000001e] 11683 1726853256.27269: sending task result for task 02083763-bbaf-c5b2-e075-00000000001e ok: [managed_node3] => {} MSG: ################################################## 11683 1726853256.27418: no more pending results, returning what we have 11683 1726853256.27422: results queue empty 11683 1726853256.27423: checking for any_errors_fatal 11683 1726853256.27427: done checking for any_errors_fatal 11683 1726853256.27428: checking for max_fail_percentage 11683 1726853256.27429: done checking for max_fail_percentage 11683 1726853256.27430: checking to see if all hosts have failed and the running result is not ok 11683 1726853256.27431: done checking to see if all hosts have failed 11683 1726853256.27432: getting the remaining hosts for this loop 11683 1726853256.27434: done getting the remaining hosts for this loop 11683 1726853256.27437: getting the next task for host managed_node3 11683 1726853256.27443: done getting next task for host managed_node3 11683 1726853256.27447: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11683 1726853256.27451: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853256.27465: getting variables 11683 1726853256.27467: in VariableManager get_vars() 11683 1726853256.27507: Calling all_inventory to load vars for managed_node3 11683 1726853256.27510: Calling groups_inventory to load vars for managed_node3 11683 1726853256.27512: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.27522: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.27524: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.27527: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.27673: done sending task result for task 02083763-bbaf-c5b2-e075-00000000001e 11683 1726853256.27677: WORKER PROCESS EXITING 11683 1726853256.27699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.27934: done with get_vars() 11683 1726853256.27944: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:27:36 -0400 (0:00:00.026) 0:00:09.352 ****** 11683 1726853256.28036: entering _queue_task() for managed_node3/include_tasks 11683 1726853256.28280: worker is 1 (out of 1 available) 11683 1726853256.28295: exiting _queue_task() for managed_node3/include_tasks 11683 1726853256.28307: done queuing things up, now waiting for results queue to drain 11683 1726853256.28308: waiting for pending results... 
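The earlier skip ("Conditional result was False" for `network_provider == "initscripts"`) is the standard behaviour of a `when:` guard. A minimal sketch of such a guarded task follows; the task name and condition are taken from the log, but the command body is an assumption, since the playbook source is not shown here.

```yaml
# Sketch of a conditionally skipped task, matching the
# "Backup the /etc/resolv.conf for initscript" skip recorded in this log.
- name: Backup the /etc/resolv.conf for initscript
  ansible.builtin.command: cp /etc/resolv.conf /tmp/resolv.conf.bak  # assumed command
  when: network_provider == "initscripts"   # evaluated False here, so the task skips
```

When the condition is false, Ansible reports `skip_reason: "Conditional result was False"` and records the failing condition in `false_condition`, exactly as seen in the `skipping: [managed_node3]` result above.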
11683 1726853256.28562: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11683 1726853256.28692: in run() - task 02083763-bbaf-c5b2-e075-000000000026 11683 1726853256.28713: variable 'ansible_search_path' from source: unknown 11683 1726853256.28721: variable 'ansible_search_path' from source: unknown 11683 1726853256.28761: calling self._execute() 11683 1726853256.28841: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.28853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.28868: variable 'omit' from source: magic vars 11683 1726853256.29212: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.29276: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.29279: _execute() done 11683 1726853256.29282: dumping result to json 11683 1726853256.29284: done dumping result, returning 11683 1726853256.29287: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-c5b2-e075-000000000026] 11683 1726853256.29289: sending task result for task 02083763-bbaf-c5b2-e075-000000000026 11683 1726853256.29517: done sending task result for task 02083763-bbaf-c5b2-e075-000000000026 11683 1726853256.29520: WORKER PROCESS EXITING 11683 1726853256.29554: no more pending results, returning what we have 11683 1726853256.29559: in VariableManager get_vars() 11683 1726853256.29602: Calling all_inventory to load vars for managed_node3 11683 1726853256.29604: Calling groups_inventory to load vars for managed_node3 11683 1726853256.29606: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.29617: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.29619: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.29622: Calling 
groups_plugins_play to load vars for managed_node3 11683 1726853256.29848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.30044: done with get_vars() 11683 1726853256.30051: variable 'ansible_search_path' from source: unknown 11683 1726853256.30052: variable 'ansible_search_path' from source: unknown 11683 1726853256.30094: we have included files to process 11683 1726853256.30095: generating all_blocks data 11683 1726853256.30097: done generating all_blocks data 11683 1726853256.30101: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11683 1726853256.30102: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11683 1726853256.30104: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11683 1726853256.30779: done processing included file 11683 1726853256.30781: iterating over new_blocks loaded from include file 11683 1726853256.30782: in VariableManager get_vars() 11683 1726853256.30805: done with get_vars() 11683 1726853256.30807: filtering new block on tags 11683 1726853256.30823: done filtering new block on tags 11683 1726853256.30825: in VariableManager get_vars() 11683 1726853256.30846: done with get_vars() 11683 1726853256.30847: filtering new block on tags 11683 1726853256.30866: done filtering new block on tags 11683 1726853256.30868: in VariableManager get_vars() 11683 1726853256.30892: done with get_vars() 11683 1726853256.30893: filtering new block on tags 11683 1726853256.30911: done filtering new block on tags 11683 1726853256.30913: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11683 1726853256.30917: extending task lists for 
all hosts with included blocks 11683 1726853256.31702: done extending task lists 11683 1726853256.31704: done processing included files 11683 1726853256.31705: results queue empty 11683 1726853256.31705: checking for any_errors_fatal 11683 1726853256.31708: done checking for any_errors_fatal 11683 1726853256.31709: checking for max_fail_percentage 11683 1726853256.31710: done checking for max_fail_percentage 11683 1726853256.31711: checking to see if all hosts have failed and the running result is not ok 11683 1726853256.31712: done checking to see if all hosts have failed 11683 1726853256.31713: getting the remaining hosts for this loop 11683 1726853256.31714: done getting the remaining hosts for this loop 11683 1726853256.31716: getting the next task for host managed_node3 11683 1726853256.31720: done getting next task for host managed_node3 11683 1726853256.31722: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11683 1726853256.31725: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853256.31735: getting variables 11683 1726853256.31736: in VariableManager get_vars() 11683 1726853256.31750: Calling all_inventory to load vars for managed_node3 11683 1726853256.31753: Calling groups_inventory to load vars for managed_node3 11683 1726853256.31755: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.31760: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.31762: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.31765: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.31923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.32118: done with get_vars() 11683 1726853256.32126: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:27:36 -0400 (0:00:00.041) 0:00:09.393 ****** 11683 1726853256.32194: entering _queue_task() for managed_node3/setup 11683 1726853256.32461: worker is 1 (out of 1 available) 11683 1726853256.32577: exiting _queue_task() for managed_node3/setup 11683 1726853256.32586: done queuing things up, now waiting for results queue to drain 11683 1726853256.32587: waiting for pending results... 
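Each worker debug line in this output has the shape `<pid> <epoch-seconds timestamp>: <message>` (for example `11683 1726853256.32194: entering _queue_task() for managed_node3/setup`). When post-processing a captured log like this one, a small parser for those fields can be handy; the sketch below is illustrative and not part of Ansible itself.

```python
import re

# Matches "<pid> <epoch.micros>: <message>" as emitted by ansible-playbook -vvvv
LINE_RE = re.compile(r"^(?P<pid>\d+)\s+(?P<ts>\d+\.\d+):\s(?P<msg>.*)$")

def parse_debug_line(line: str):
    """Return (pid, timestamp, message) or None if the line doesn't match."""
    m = LINE_RE.match(line.strip())
    if not m:
        return None
    return int(m.group("pid")), float(m.group("ts")), m.group("msg")

example = "11683 1726853256.32194: entering _queue_task() for managed_node3/setup"
print(parse_debug_line(example))
```

Grouping parsed lines by timestamp makes it straightforward to compute per-task durations, which the log otherwise only reports in the `TASK [...]` banner lines (e.g. `(0:00:00.041)`).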
11683 1726853256.32755: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11683 1726853256.32881: in run() - task 02083763-bbaf-c5b2-e075-000000000188 11683 1726853256.32899: variable 'ansible_search_path' from source: unknown 11683 1726853256.32905: variable 'ansible_search_path' from source: unknown 11683 1726853256.32943: calling self._execute() 11683 1726853256.33029: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.33085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.33102: variable 'omit' from source: magic vars 11683 1726853256.33554: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.33575: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.33840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853256.36141: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853256.36190: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853256.36219: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853256.36245: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853256.36267: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853256.36329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853256.36353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853256.36369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853256.36397: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853256.36407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853256.36451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853256.36467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853256.36485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853256.36509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853256.36522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853256.36632: variable '__network_required_facts' from source: role 
'' defaults 11683 1726853256.36642: variable 'ansible_facts' from source: unknown 11683 1726853256.36705: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11683 1726853256.36709: when evaluation is False, skipping this task 11683 1726853256.36712: _execute() done 11683 1726853256.36715: dumping result to json 11683 1726853256.36717: done dumping result, returning 11683 1726853256.36724: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-c5b2-e075-000000000188] 11683 1726853256.36727: sending task result for task 02083763-bbaf-c5b2-e075-000000000188 11683 1726853256.36815: done sending task result for task 02083763-bbaf-c5b2-e075-000000000188 11683 1726853256.36817: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853256.36887: no more pending results, returning what we have 11683 1726853256.36891: results queue empty 11683 1726853256.36892: checking for any_errors_fatal 11683 1726853256.36893: done checking for any_errors_fatal 11683 1726853256.36894: checking for max_fail_percentage 11683 1726853256.36895: done checking for max_fail_percentage 11683 1726853256.36896: checking to see if all hosts have failed and the running result is not ok 11683 1726853256.36897: done checking to see if all hosts have failed 11683 1726853256.36898: getting the remaining hosts for this loop 11683 1726853256.36899: done getting the remaining hosts for this loop 11683 1726853256.36902: getting the next task for host managed_node3 11683 1726853256.36910: done getting next task for host managed_node3 11683 1726853256.36914: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11683 1726853256.36918: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853256.36933: getting variables 11683 1726853256.36934: in VariableManager get_vars() 11683 1726853256.36974: Calling all_inventory to load vars for managed_node3 11683 1726853256.36976: Calling groups_inventory to load vars for managed_node3 11683 1726853256.36978: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.36987: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.36989: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.36991: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.37121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.37244: done with get_vars() 11683 1726853256.37253: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:27:36 -0400 (0:00:00.051) 0:00:09.445 ****** 11683 1726853256.37326: entering _queue_task() for managed_node3/stat 11683 1726853256.37594: worker is 1 (out of 1 
available) 11683 1726853256.37609: exiting _queue_task() for managed_node3/stat 11683 1726853256.37621: done queuing things up, now waiting for results queue to drain 11683 1726853256.37622: waiting for pending results... 11683 1726853256.37990: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11683 1726853256.37995: in run() - task 02083763-bbaf-c5b2-e075-00000000018a 11683 1726853256.37998: variable 'ansible_search_path' from source: unknown 11683 1726853256.38000: variable 'ansible_search_path' from source: unknown 11683 1726853256.38023: calling self._execute() 11683 1726853256.38105: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.38116: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.38131: variable 'omit' from source: magic vars 11683 1726853256.38470: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.38489: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.38661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853256.38918: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853256.38956: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853256.38981: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853256.39006: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853256.39073: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853256.39092: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853256.39110: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853256.39127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853256.39194: variable '__network_is_ostree' from source: set_fact 11683 1726853256.39200: Evaluated conditional (not __network_is_ostree is defined): False 11683 1726853256.39203: when evaluation is False, skipping this task 11683 1726853256.39206: _execute() done 11683 1726853256.39208: dumping result to json 11683 1726853256.39211: done dumping result, returning 11683 1726853256.39218: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-c5b2-e075-00000000018a] 11683 1726853256.39224: sending task result for task 02083763-bbaf-c5b2-e075-00000000018a 11683 1726853256.39310: done sending task result for task 02083763-bbaf-c5b2-e075-00000000018a 11683 1726853256.39313: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11683 1726853256.39357: no more pending results, returning what we have 11683 1726853256.39361: results queue empty 11683 1726853256.39362: checking for any_errors_fatal 11683 1726853256.39367: done checking for any_errors_fatal 11683 1726853256.39368: checking for max_fail_percentage 11683 1726853256.39369: done checking for max_fail_percentage 11683 1726853256.39372: checking to see if all hosts have failed and the running result is not ok 11683 
1726853256.39373: done checking to see if all hosts have failed 11683 1726853256.39374: getting the remaining hosts for this loop 11683 1726853256.39375: done getting the remaining hosts for this loop 11683 1726853256.39378: getting the next task for host managed_node3 11683 1726853256.39385: done getting next task for host managed_node3 11683 1726853256.39388: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11683 1726853256.39391: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853256.39404: getting variables 11683 1726853256.39405: in VariableManager get_vars() 11683 1726853256.39440: Calling all_inventory to load vars for managed_node3 11683 1726853256.39442: Calling groups_inventory to load vars for managed_node3 11683 1726853256.39445: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.39453: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.39455: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.39457: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.39609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.39729: done with get_vars() 11683 1726853256.39736: done getting variables 11683 1726853256.39776: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:27:36 -0400 (0:00:00.024) 0:00:09.469 ****** 11683 1726853256.39799: entering _queue_task() for managed_node3/set_fact 11683 1726853256.39984: worker is 1 (out of 1 available) 11683 1726853256.39995: exiting _queue_task() for managed_node3/set_fact 11683 1726853256.40006: done queuing things up, now waiting for results queue to drain 11683 1726853256.40007: waiting for pending results... 
11683 1726853256.40155: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11683 1726853256.40236: in run() - task 02083763-bbaf-c5b2-e075-00000000018b 11683 1726853256.40251: variable 'ansible_search_path' from source: unknown 11683 1726853256.40256: variable 'ansible_search_path' from source: unknown 11683 1726853256.40283: calling self._execute() 11683 1726853256.40337: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.40341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.40356: variable 'omit' from source: magic vars 11683 1726853256.40699: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.40703: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.40876: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853256.41045: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853256.41095: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853256.41131: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853256.41167: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853256.41252: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853256.41286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853256.41317: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853256.41348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853256.41439: variable '__network_is_ostree' from source: set_fact 11683 1726853256.41452: Evaluated conditional (not __network_is_ostree is defined): False 11683 1726853256.41460: when evaluation is False, skipping this task 11683 1726853256.41467: _execute() done 11683 1726853256.41477: dumping result to json 11683 1726853256.41483: done dumping result, returning 11683 1726853256.41494: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-c5b2-e075-00000000018b] 11683 1726853256.41503: sending task result for task 02083763-bbaf-c5b2-e075-00000000018b 11683 1726853256.41691: done sending task result for task 02083763-bbaf-c5b2-e075-00000000018b 11683 1726853256.41694: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11683 1726853256.41739: no more pending results, returning what we have 11683 1726853256.41743: results queue empty 11683 1726853256.41744: checking for any_errors_fatal 11683 1726853256.41748: done checking for any_errors_fatal 11683 1726853256.41748: checking for max_fail_percentage 11683 1726853256.41750: done checking for max_fail_percentage 11683 1726853256.41751: checking to see if all hosts have failed and the running result is not ok 11683 1726853256.41753: done checking to see if all hosts have failed 11683 1726853256.41753: getting the remaining hosts for this loop 11683 1726853256.41755: done getting the remaining hosts for this loop 
11683 1726853256.41758: getting the next task for host managed_node3 11683 1726853256.41769: done getting next task for host managed_node3 11683 1726853256.41775: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11683 1726853256.41778: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853256.41793: getting variables 11683 1726853256.41795: in VariableManager get_vars() 11683 1726853256.41837: Calling all_inventory to load vars for managed_node3 11683 1726853256.41840: Calling groups_inventory to load vars for managed_node3 11683 1726853256.41842: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853256.41853: Calling all_plugins_play to load vars for managed_node3 11683 1726853256.41856: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853256.41859: Calling groups_plugins_play to load vars for managed_node3 11683 1726853256.42135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853256.42268: done with get_vars() 11683 1726853256.42277: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:27:36 -0400 (0:00:00.025) 0:00:09.495 ****** 11683 1726853256.42340: entering _queue_task() for managed_node3/service_facts 11683 1726853256.42341: Creating lock for service_facts 11683 1726853256.42527: worker is 1 (out of 1 available) 11683 1726853256.42540: exiting _queue_task() for managed_node3/service_facts 11683 1726853256.42550: done queuing things up, now waiting for results queue to drain 11683 1726853256.42551: waiting for pending results... 
11683 1726853256.42712: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11683 1726853256.42796: in run() - task 02083763-bbaf-c5b2-e075-00000000018d 11683 1726853256.42808: variable 'ansible_search_path' from source: unknown 11683 1726853256.42811: variable 'ansible_search_path' from source: unknown 11683 1726853256.42837: calling self._execute() 11683 1726853256.42899: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.42903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.42911: variable 'omit' from source: magic vars 11683 1726853256.43164: variable 'ansible_distribution_major_version' from source: facts 11683 1726853256.43174: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853256.43179: variable 'omit' from source: magic vars 11683 1726853256.43224: variable 'omit' from source: magic vars 11683 1726853256.43250: variable 'omit' from source: magic vars 11683 1726853256.43282: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853256.43307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853256.43325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853256.43396: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853256.43406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853256.43430: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853256.43438: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.43441: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11683 1726853256.43508: Set connection var ansible_shell_executable to /bin/sh 11683 1726853256.43517: Set connection var ansible_timeout to 10 11683 1726853256.43523: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853256.43528: Set connection var ansible_pipelining to False 11683 1726853256.43530: Set connection var ansible_shell_type to sh 11683 1726853256.43541: Set connection var ansible_connection to ssh 11683 1726853256.43554: variable 'ansible_shell_executable' from source: unknown 11683 1726853256.43557: variable 'ansible_connection' from source: unknown 11683 1726853256.43560: variable 'ansible_module_compression' from source: unknown 11683 1726853256.43562: variable 'ansible_shell_type' from source: unknown 11683 1726853256.43565: variable 'ansible_shell_executable' from source: unknown 11683 1726853256.43567: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853256.43569: variable 'ansible_pipelining' from source: unknown 11683 1726853256.43574: variable 'ansible_timeout' from source: unknown 11683 1726853256.43633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853256.43815: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853256.43820: variable 'omit' from source: magic vars 11683 1726853256.43823: starting attempt loop 11683 1726853256.43825: running the handler 11683 1726853256.43827: _low_level_execute_command(): starting 11683 1726853256.43831: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853256.44516: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11683 1726853256.44521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853256.44524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.44610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853256.44700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853256.46422: stdout chunk (state=3): >>>/root <<< 11683 1726853256.46514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853256.46542: stderr chunk (state=3): >>><<< 11683 1726853256.46545: stdout chunk (state=3): >>><<< 11683 1726853256.46566: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853256.46579: _low_level_execute_command(): starting 11683 1726853256.46585: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715 `" && echo ansible-tmp-1726853256.4656637-12116-231367433597715="` echo /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715 `" ) && sleep 0' 11683 1726853256.47123: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853256.47127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853256.47137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.47191: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853256.47220: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853256.49135: stdout chunk (state=3): >>>ansible-tmp-1726853256.4656637-12116-231367433597715=/root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715 <<< 11683 1726853256.49241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853256.49268: stderr chunk (state=3): >>><<< 11683 1726853256.49273: stdout chunk (state=3): >>><<< 11683 1726853256.49288: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853256.4656637-12116-231367433597715=/root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853256.49327: variable 'ansible_module_compression' from source: unknown 11683 1726853256.49362: ANSIBALLZ: Using lock for service_facts 11683 1726853256.49365: ANSIBALLZ: Acquiring lock 11683 1726853256.49367: ANSIBALLZ: Lock acquired: 139785057559200 11683 1726853256.49370: ANSIBALLZ: Creating module 11683 1726853256.57187: ANSIBALLZ: Writing module into payload 11683 1726853256.57251: ANSIBALLZ: Writing module 11683 1726853256.57275: ANSIBALLZ: Renaming module 11683 1726853256.57281: ANSIBALLZ: Done creating module 11683 1726853256.57297: variable 'ansible_facts' from source: unknown 11683 1726853256.57342: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py 11683 1726853256.57444: Sending initial data 11683 1726853256.57448: Sent initial data (162 bytes) 11683 1726853256.57909: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853256.57912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853256.57915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.57917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config <<< 11683 1726853256.57919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853256.57921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.57976: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853256.57979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853256.57981: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853256.58054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853256.59721: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853256.59780: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853256.59840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpjwsxoa76 /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py <<< 11683 1726853256.59843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py" <<< 11683 1726853256.59898: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpjwsxoa76" to remote "/root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py" <<< 11683 1726853256.60508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853256.60556: stderr chunk (state=3): >>><<< 11683 1726853256.60559: stdout chunk (state=3): >>><<< 11683 1726853256.60598: done transferring module to remote 11683 1726853256.60607: _low_level_execute_command(): starting 11683 1726853256.60612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/ /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py && sleep 0' 11683 1726853256.61075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853256.61078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853256.61081: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.61083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853256.61089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853256.61091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.61139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853256.61146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853256.61148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853256.61201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853256.63047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853256.63067: stderr chunk (state=3): >>><<< 11683 1726853256.63072: stdout chunk (state=3): >>><<< 11683 1726853256.63084: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853256.63088: _low_level_execute_command(): starting 11683 1726853256.63094: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/AnsiballZ_service_facts.py && sleep 0' 11683 1726853256.63530: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853256.63533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853256.63535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.63537: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853256.63539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853256.63594: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853256.63599: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853256.63663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853258.24586: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": 
"dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 11683 1726853258.24606: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": 
"systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 11683 1726853258.24637: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": 
"enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11683 1726853258.26151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853258.26184: stderr chunk (state=3): >>><<< 11683 1726853258.26187: stdout chunk (state=3): >>><<< 11683 1726853258.26206: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {...}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
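The `service_facts` result above maps each unit name to a dict with `state`, `status`, and `source` keys; a later task (for example a `when:` condition) typically filters that mapping by `state`. A minimal sketch, using a hand-picked excerpt of the entries from this log rather than the full dictionary:

```python
# Illustrative excerpt of ansible_facts["services"] as returned by the
# service_facts module in the log above (not the complete mapping).
services = {
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
}

# Filter to the units systemd reports as currently running.
running = sorted(name for name, svc in services.items()
                 if svc["state"] == "running")
print(running)  # ['NetworkManager.service', 'sshd.service']
```

The same filter in a playbook would be a Jinja2 expression over `ansible_facts.services`, e.g. selecting items whose `value.state == 'running'`.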
11683 1726853258.27433: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853258.27441: _low_level_execute_command(): starting 11683 1726853258.27448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853256.4656637-12116-231367433597715/ > /dev/null 2>&1 && sleep 0' 11683 1726853258.27912: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853258.27916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853258.27918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853258.27920: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853258.27922: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.27968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853258.27985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853258.28042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853258.29918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853258.29948: stderr chunk (state=3): >>><<< 11683 1726853258.29951: stdout chunk (state=3): >>><<< 11683 1726853258.29963: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11683 1726853258.29969: handler run complete 11683 1726853258.30079: variable 'ansible_facts' from source: unknown 11683 1726853258.30173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853258.30428: variable 'ansible_facts' from source: unknown 11683 1726853258.30506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853258.30619: attempt loop complete, returning result 11683 1726853258.30624: _execute() done 11683 1726853258.30626: dumping result to json 11683 1726853258.30662: done dumping result, returning 11683 1726853258.30673: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-c5b2-e075-00000000018d] 11683 1726853258.30676: sending task result for task 02083763-bbaf-c5b2-e075-00000000018d ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853258.31257: no more pending results, returning what we have 11683 1726853258.31259: results queue empty 11683 1726853258.31260: checking for any_errors_fatal 11683 1726853258.31263: done checking for any_errors_fatal 11683 1726853258.31264: checking for max_fail_percentage 11683 1726853258.31265: done checking for max_fail_percentage 11683 1726853258.31266: checking to see if all hosts have failed and the running result is not ok 11683 1726853258.31267: done checking to see if all hosts have failed 11683 1726853258.31267: getting the remaining hosts for this loop 11683 1726853258.31268: done getting the remaining hosts for this loop 11683 1726853258.31273: getting the next task for host managed_node3 11683 1726853258.31278: done getting next task for host managed_node3 11683 1726853258.31285: ^ task is: TASK: fedora.linux_system_roles.network : Check which 
packages are installed 11683 1726853258.31288: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853258.31297: getting variables 11683 1726853258.31298: in VariableManager get_vars() 11683 1726853258.31322: Calling all_inventory to load vars for managed_node3 11683 1726853258.31323: Calling groups_inventory to load vars for managed_node3 11683 1726853258.31325: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853258.31332: Calling all_plugins_play to load vars for managed_node3 11683 1726853258.31334: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853258.31337: Calling groups_plugins_play to load vars for managed_node3 11683 1726853258.31665: done sending task result for task 02083763-bbaf-c5b2-e075-00000000018d 11683 1726853258.31668: WORKER PROCESS EXITING 11683 1726853258.31680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853258.31950: done with get_vars() 11683 1726853258.31959: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:27:38 -0400 (0:00:01.896) 0:00:11.392 ****** 11683 1726853258.32027: entering _queue_task() for managed_node3/package_facts 11683 1726853258.32029: Creating lock for package_facts 11683 1726853258.32253: worker is 1 (out of 1 available) 11683 1726853258.32267: exiting _queue_task() for managed_node3/package_facts 11683 1726853258.32280: done queuing things up, now waiting for results queue to drain 11683 1726853258.32281: waiting for pending results... 11683 1726853258.32437: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11683 1726853258.32528: in run() - task 02083763-bbaf-c5b2-e075-00000000018e 11683 1726853258.32540: variable 'ansible_search_path' from source: unknown 11683 1726853258.32543: variable 'ansible_search_path' from source: unknown 11683 1726853258.32569: calling self._execute() 11683 1726853258.32632: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853258.32636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853258.32648: variable 'omit' from source: magic vars 11683 1726853258.32900: variable 'ansible_distribution_major_version' from source: facts 11683 1726853258.32910: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853258.32915: variable 'omit' from source: magic vars 11683 1726853258.32964: variable 'omit' from source: magic vars 11683 1726853258.32989: variable 'omit' from source: magic vars 11683 1726853258.33019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853258.33049: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853258.33065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11683 1726853258.33080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853258.33090: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853258.33113: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853258.33116: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853258.33119: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853258.33188: Set connection var ansible_shell_executable to /bin/sh 11683 1726853258.33197: Set connection var ansible_timeout to 10 11683 1726853258.33203: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853258.33207: Set connection var ansible_pipelining to False 11683 1726853258.33210: Set connection var ansible_shell_type to sh 11683 1726853258.33212: Set connection var ansible_connection to ssh 11683 1726853258.33228: variable 'ansible_shell_executable' from source: unknown 11683 1726853258.33230: variable 'ansible_connection' from source: unknown 11683 1726853258.33233: variable 'ansible_module_compression' from source: unknown 11683 1726853258.33236: variable 'ansible_shell_type' from source: unknown 11683 1726853258.33238: variable 'ansible_shell_executable' from source: unknown 11683 1726853258.33240: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853258.33242: variable 'ansible_pipelining' from source: unknown 11683 1726853258.33247: variable 'ansible_timeout' from source: unknown 11683 1726853258.33250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853258.33390: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853258.33394: variable 'omit' from source: magic vars 11683 1726853258.33400: starting attempt loop 11683 1726853258.33403: running the handler 11683 1726853258.33414: _low_level_execute_command(): starting 11683 1726853258.33420: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853258.33932: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853258.33935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.33939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853258.33942: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.34000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853258.34003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853258.34006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853258.34074: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11683 1726853258.35761: stdout chunk (state=3): >>>/root <<< 11683 1726853258.35855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853258.35889: stderr chunk (state=3): >>><<< 11683 1726853258.35892: stdout chunk (state=3): >>><<< 11683 1726853258.35911: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853258.35922: _low_level_execute_command(): starting 11683 1726853258.35929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941 `" && echo ansible-tmp-1726853258.3591194-12182-170177814602941="` echo /root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941 `" ) && sleep 0' 11683 
1726853258.36382: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853258.36385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.36396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853258.36398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.36443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853258.36450: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853258.36515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853258.38470: stdout chunk (state=3): >>>ansible-tmp-1726853258.3591194-12182-170177814602941=/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941 <<< 11683 1726853258.38570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853258.38602: stderr chunk (state=3): >>><<< 11683 1726853258.38605: stdout chunk (state=3): >>><<< 11683 1726853258.38620: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853258.3591194-12182-170177814602941=/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853258.38662: variable 'ansible_module_compression' from source: unknown 11683 1726853258.38707: ANSIBALLZ: Using lock for package_facts 11683 1726853258.38711: ANSIBALLZ: Acquiring lock 11683 1726853258.38713: ANSIBALLZ: Lock acquired: 139785059399152 11683 1726853258.38716: ANSIBALLZ: Creating module 11683 1726853258.56742: ANSIBALLZ: Writing module into payload 11683 1726853258.56836: ANSIBALLZ: Writing module 11683 1726853258.56859: ANSIBALLZ: Renaming module 11683 1726853258.56865: ANSIBALLZ: Done creating module 11683 1726853258.56886: variable 'ansible_facts' from source: unknown 11683 1726853258.57006: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py 11683 1726853258.57114: Sending initial data 11683 1726853258.57117: Sent initial data (162 bytes) 11683 1726853258.57585: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853258.57588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.57592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853258.57594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.57643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853258.57649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853258.57654: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853258.57720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853258.59423: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server 
supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853258.59475: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853258.59538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpn6mu9avi /root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py <<< 11683 1726853258.59541: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py" <<< 11683 1726853258.59600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpn6mu9avi" to remote "/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py" <<< 11683 1726853258.59602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py" <<< 11683 1726853258.60750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853258.60794: stderr chunk (state=3): >>><<< 11683 1726853258.60798: stdout chunk (state=3): >>><<< 11683 1726853258.60840: done transferring module to remote 11683 1726853258.60852: _low_level_execute_command(): starting 11683 1726853258.60857: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/ /root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py && sleep 0' 11683 1726853258.61315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853258.61319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853258.61321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.61327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853258.61329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.61377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853258.61391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853258.61444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853258.63309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853258.63336: stderr chunk (state=3): >>><<< 11683 1726853258.63339: stdout chunk (state=3): >>><<< 11683 1726853258.63353: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853258.63356: _low_level_execute_command(): starting 11683 1726853258.63361: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/AnsiballZ_package_facts.py && sleep 0' 11683 1726853258.63807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853258.63811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.63825: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853258.63887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853258.63895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853258.63957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853259.08820: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11683 1726853259.08889: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": 
[{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", 
"version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": 
[{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11683 1726853259.08944: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 11683 1726853259.09008: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": 
[{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": 
"4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], 
"perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", 
"version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": 
"510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11683 1726853259.09035: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": 
"2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": 
"python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": 
"24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11683 1726853259.10878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853259.10882: stdout chunk (state=3): >>><<< 11683 1726853259.10884: stderr chunk (state=3): >>><<< 11683 1726853259.10986: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853259.13160: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853259.13179: _low_level_execute_command(): starting 11683 1726853259.13183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853258.3591194-12182-170177814602941/ > /dev/null 2>&1 && sleep 0' 11683 1726853259.13667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853259.13673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853259.13716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853259.13780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853259.16082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853259.16087: stdout chunk (state=3): >>><<< 11683 1726853259.16089: stderr chunk (state=3): >>><<< 11683 1726853259.16092: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853259.16094: handler run complete 11683 1726853259.17239: variable 'ansible_facts' from source: unknown 11683 1726853259.17694: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.19858: variable 'ansible_facts' from source: unknown 11683 1726853259.20610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.21914: attempt loop complete, returning result 11683 1726853259.21934: _execute() done 11683 1726853259.21947: dumping result to json 11683 1726853259.22061: done dumping result, returning 11683 1726853259.22069: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-c5b2-e075-00000000018e] 11683 1726853259.22074: sending task result for task 02083763-bbaf-c5b2-e075-00000000018e 11683 1726853259.23282: done sending task result for task 02083763-bbaf-c5b2-e075-00000000018e 11683 1726853259.23286: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853259.23327: no more pending results, returning what we have 11683 1726853259.23328: results queue empty 11683 1726853259.23329: checking for any_errors_fatal 11683 1726853259.23331: done checking for any_errors_fatal 11683 1726853259.23332: checking for max_fail_percentage 11683 1726853259.23333: done checking for max_fail_percentage 11683 1726853259.23333: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.23334: done checking to see if all hosts have failed 11683 1726853259.23335: getting the remaining hosts for this loop 11683 1726853259.23335: done getting the remaining hosts for this loop 11683 1726853259.23338: getting the next task for host managed_node3 11683 1726853259.23342: done getting next task for host managed_node3 11683 1726853259.23345: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11683 1726853259.23347: 
^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853259.23354: getting variables 11683 1726853259.23355: in VariableManager get_vars() 11683 1726853259.23380: Calling all_inventory to load vars for managed_node3 11683 1726853259.23382: Calling groups_inventory to load vars for managed_node3 11683 1726853259.23383: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.23389: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.23391: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.23393: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.24589: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.25934: done with get_vars() 11683 1726853259.25956: done getting variables 11683 1726853259.26003: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:27:39 -0400 (0:00:00.940) 0:00:12.332 ****** 11683 
1726853259.26028: entering _queue_task() for managed_node3/debug 11683 1726853259.26267: worker is 1 (out of 1 available) 11683 1726853259.26283: exiting _queue_task() for managed_node3/debug 11683 1726853259.26296: done queuing things up, now waiting for results queue to drain 11683 1726853259.26297: waiting for pending results... 11683 1726853259.26468: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11683 1726853259.26548: in run() - task 02083763-bbaf-c5b2-e075-000000000027 11683 1726853259.26562: variable 'ansible_search_path' from source: unknown 11683 1726853259.26565: variable 'ansible_search_path' from source: unknown 11683 1726853259.26596: calling self._execute() 11683 1726853259.26660: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.26664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.26674: variable 'omit' from source: magic vars 11683 1726853259.26937: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.26945: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.26955: variable 'omit' from source: magic vars 11683 1726853259.26992: variable 'omit' from source: magic vars 11683 1726853259.27060: variable 'network_provider' from source: set_fact 11683 1726853259.27077: variable 'omit' from source: magic vars 11683 1726853259.27112: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853259.27137: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853259.27176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853259.27198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853259.27208: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853259.27234: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853259.27245: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.27248: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.27358: Set connection var ansible_shell_executable to /bin/sh 11683 1726853259.27362: Set connection var ansible_timeout to 10 11683 1726853259.27364: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853259.27367: Set connection var ansible_pipelining to False 11683 1726853259.27369: Set connection var ansible_shell_type to sh 11683 1726853259.27373: Set connection var ansible_connection to ssh 11683 1726853259.27404: variable 'ansible_shell_executable' from source: unknown 11683 1726853259.27407: variable 'ansible_connection' from source: unknown 11683 1726853259.27410: variable 'ansible_module_compression' from source: unknown 11683 1726853259.27412: variable 'ansible_shell_type' from source: unknown 11683 1726853259.27415: variable 'ansible_shell_executable' from source: unknown 11683 1726853259.27417: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.27419: variable 'ansible_pipelining' from source: unknown 11683 1726853259.27423: variable 'ansible_timeout' from source: unknown 11683 1726853259.27425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.27677: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853259.27681: variable 'omit' from source: magic vars 11683 
1726853259.27684: starting attempt loop 11683 1726853259.27687: running the handler 11683 1726853259.27689: handler run complete 11683 1726853259.27691: attempt loop complete, returning result 11683 1726853259.27693: _execute() done 11683 1726853259.27695: dumping result to json 11683 1726853259.27696: done dumping result, returning 11683 1726853259.27698: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-c5b2-e075-000000000027] 11683 1726853259.27700: sending task result for task 02083763-bbaf-c5b2-e075-000000000027 11683 1726853259.27760: done sending task result for task 02083763-bbaf-c5b2-e075-000000000027 11683 1726853259.27763: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11683 1726853259.27836: no more pending results, returning what we have 11683 1726853259.27840: results queue empty 11683 1726853259.27841: checking for any_errors_fatal 11683 1726853259.27851: done checking for any_errors_fatal 11683 1726853259.27852: checking for max_fail_percentage 11683 1726853259.27854: done checking for max_fail_percentage 11683 1726853259.27856: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.27857: done checking to see if all hosts have failed 11683 1726853259.27858: getting the remaining hosts for this loop 11683 1726853259.27860: done getting the remaining hosts for this loop 11683 1726853259.27863: getting the next task for host managed_node3 11683 1726853259.27873: done getting next task for host managed_node3 11683 1726853259.27877: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11683 1726853259.27880: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853259.27892: getting variables 11683 1726853259.27894: in VariableManager get_vars() 11683 1726853259.28048: Calling all_inventory to load vars for managed_node3 11683 1726853259.28051: Calling groups_inventory to load vars for managed_node3 11683 1726853259.28053: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.28063: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.28066: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.28069: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.29246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.30113: done with get_vars() 11683 1726853259.30132: done getting variables 11683 1726853259.30201: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:27:39 -0400 (0:00:00.041) 0:00:12.374 ****** 11683 1726853259.30227: entering _queue_task() for managed_node3/fail 11683 1726853259.30228: Creating lock for fail 11683 
1726853259.30469: worker is 1 (out of 1 available) 11683 1726853259.30486: exiting _queue_task() for managed_node3/fail 11683 1726853259.30498: done queuing things up, now waiting for results queue to drain 11683 1726853259.30500: waiting for pending results... 11683 1726853259.30721: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11683 1726853259.30803: in run() - task 02083763-bbaf-c5b2-e075-000000000028 11683 1726853259.30814: variable 'ansible_search_path' from source: unknown 11683 1726853259.30819: variable 'ansible_search_path' from source: unknown 11683 1726853259.30862: calling self._execute() 11683 1726853259.30954: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.30958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.30967: variable 'omit' from source: magic vars 11683 1726853259.31479: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.31482: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.31484: variable 'network_state' from source: role '' defaults 11683 1726853259.31500: Evaluated conditional (network_state != {}): False 11683 1726853259.31511: when evaluation is False, skipping this task 11683 1726853259.31514: _execute() done 11683 1726853259.31517: dumping result to json 11683 1726853259.31519: done dumping result, returning 11683 1726853259.31527: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-c5b2-e075-000000000028] 11683 1726853259.31530: sending task result for task 02083763-bbaf-c5b2-e075-000000000028 skipping: [managed_node3] => { "changed": false, "false_condition": 
"network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853259.31694: no more pending results, returning what we have 11683 1726853259.31698: results queue empty 11683 1726853259.31699: checking for any_errors_fatal 11683 1726853259.31705: done checking for any_errors_fatal 11683 1726853259.31706: checking for max_fail_percentage 11683 1726853259.31707: done checking for max_fail_percentage 11683 1726853259.31708: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.31710: done checking to see if all hosts have failed 11683 1726853259.31710: getting the remaining hosts for this loop 11683 1726853259.31712: done getting the remaining hosts for this loop 11683 1726853259.31737: getting the next task for host managed_node3 11683 1726853259.31746: done getting next task for host managed_node3 11683 1726853259.31750: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11683 1726853259.31753: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853259.31769: getting variables 11683 1726853259.31772: in VariableManager get_vars() 11683 1726853259.31808: Calling all_inventory to load vars for managed_node3 11683 1726853259.31811: Calling groups_inventory to load vars for managed_node3 11683 1726853259.31813: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.31822: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.31851: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.31857: done sending task result for task 02083763-bbaf-c5b2-e075-000000000028 11683 1726853259.31859: WORKER PROCESS EXITING 11683 1726853259.31863: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.33103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.34060: done with get_vars() 11683 1726853259.34080: done getting variables 11683 1726853259.34126: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:27:39 -0400 (0:00:00.039) 0:00:12.413 ****** 11683 1726853259.34152: entering _queue_task() for managed_node3/fail 11683 1726853259.34406: worker is 1 (out of 1 available) 11683 1726853259.34419: exiting _queue_task() for managed_node3/fail 11683 1726853259.34429: done queuing things up, now waiting for results queue to drain 11683 1726853259.34431: waiting for pending results... 
11683 1726853259.34605: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11683 1726853259.34684: in run() - task 02083763-bbaf-c5b2-e075-000000000029 11683 1726853259.34696: variable 'ansible_search_path' from source: unknown 11683 1726853259.34700: variable 'ansible_search_path' from source: unknown 11683 1726853259.34729: calling self._execute() 11683 1726853259.34797: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.34802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.34811: variable 'omit' from source: magic vars 11683 1726853259.35261: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.35264: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.35333: variable 'network_state' from source: role '' defaults 11683 1726853259.35347: Evaluated conditional (network_state != {}): False 11683 1726853259.35354: when evaluation is False, skipping this task 11683 1726853259.35360: _execute() done 11683 1726853259.35373: dumping result to json 11683 1726853259.35385: done dumping result, returning 11683 1726853259.35395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-c5b2-e075-000000000029] 11683 1726853259.35404: sending task result for task 02083763-bbaf-c5b2-e075-000000000029 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853259.35661: no more pending results, returning what we have 11683 1726853259.35666: results queue empty 11683 1726853259.35666: checking for any_errors_fatal 11683 1726853259.35678: done checking for any_errors_fatal 
11683 1726853259.35679: checking for max_fail_percentage 11683 1726853259.35681: done checking for max_fail_percentage 11683 1726853259.35682: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.35684: done checking to see if all hosts have failed 11683 1726853259.35685: getting the remaining hosts for this loop 11683 1726853259.35686: done getting the remaining hosts for this loop 11683 1726853259.35691: getting the next task for host managed_node3 11683 1726853259.35697: done getting next task for host managed_node3 11683 1726853259.35702: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11683 1726853259.35705: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853259.35724: getting variables 11683 1726853259.35726: in VariableManager get_vars() 11683 1726853259.35769: Calling all_inventory to load vars for managed_node3 11683 1726853259.35829: Calling groups_inventory to load vars for managed_node3 11683 1726853259.35832: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.35847: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.35850: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.35853: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.36377: done sending task result for task 02083763-bbaf-c5b2-e075-000000000029 11683 1726853259.36381: WORKER PROCESS EXITING 11683 1726853259.36690: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.37543: done with get_vars() 11683 1726853259.37561: done getting variables 11683 1726853259.37604: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:27:39 -0400 (0:00:00.034) 0:00:12.448 ****** 11683 1726853259.37629: entering _queue_task() for managed_node3/fail 11683 1726853259.37859: worker is 1 (out of 1 available) 11683 1726853259.37876: exiting _queue_task() for managed_node3/fail 11683 1726853259.37888: done queuing things up, now waiting for results queue to drain 11683 1726853259.37889: waiting for pending results... 
11683 1726853259.38059: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11683 1726853259.38145: in run() - task 02083763-bbaf-c5b2-e075-00000000002a 11683 1726853259.38159: variable 'ansible_search_path' from source: unknown 11683 1726853259.38163: variable 'ansible_search_path' from source: unknown 11683 1726853259.38192: calling self._execute() 11683 1726853259.38257: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.38260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.38269: variable 'omit' from source: magic vars 11683 1726853259.38535: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.38546: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.38668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853259.40153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853259.40206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853259.40233: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853259.40259: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853259.40284: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853259.40339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.40363: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.40382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.40412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.40423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.40490: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.40505: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11683 1726853259.40582: variable 'ansible_distribution' from source: facts 11683 1726853259.40585: variable '__network_rh_distros' from source: role '' defaults 11683 1726853259.40593: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11683 1726853259.40747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.40765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.40784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 
1726853259.40808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.40818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.40856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.40873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.40892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.40916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.40925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.40959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.40977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11683 1726853259.40993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.41017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.41027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.41217: variable 'network_connections' from source: task vars 11683 1726853259.41226: variable 'controller_profile' from source: play vars 11683 1726853259.41275: variable 'controller_profile' from source: play vars 11683 1726853259.41284: variable 'controller_device' from source: play vars 11683 1726853259.41325: variable 'controller_device' from source: play vars 11683 1726853259.41342: variable 'port1_profile' from source: play vars 11683 1726853259.41403: variable 'port1_profile' from source: play vars 11683 1726853259.41409: variable 'dhcp_interface1' from source: play vars 11683 1726853259.41453: variable 'dhcp_interface1' from source: play vars 11683 1726853259.41459: variable 'controller_profile' from source: play vars 11683 1726853259.41503: variable 'controller_profile' from source: play vars 11683 1726853259.41509: variable 'port2_profile' from source: play vars 11683 1726853259.41551: variable 'port2_profile' from source: play vars 11683 1726853259.41558: variable 'dhcp_interface2' from source: play vars 11683 1726853259.41600: variable 'dhcp_interface2' from source: play vars 11683 1726853259.41609: variable 'controller_profile' from source: play vars 11683 1726853259.41917: variable 'controller_profile' from source: play vars 11683 1726853259.41924: 
variable 'network_state' from source: role '' defaults 11683 1726853259.41980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853259.42102: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853259.42129: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853259.42155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853259.42179: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853259.42212: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853259.42229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853259.42250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.42269: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853259.42300: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11683 1726853259.42303: when evaluation is False, skipping this task 11683 1726853259.42306: _execute() done 11683 1726853259.42309: dumping result to 
json 11683 1726853259.42311: done dumping result, returning 11683 1726853259.42318: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-c5b2-e075-00000000002a] 11683 1726853259.42321: sending task result for task 02083763-bbaf-c5b2-e075-00000000002a 11683 1726853259.42407: done sending task result for task 02083763-bbaf-c5b2-e075-00000000002a 11683 1726853259.42409: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11683 1726853259.42461: no more pending results, returning what we have 11683 1726853259.42464: results queue empty 11683 1726853259.42465: checking for any_errors_fatal 11683 1726853259.42469: done checking for any_errors_fatal 11683 1726853259.42470: checking for max_fail_percentage 11683 1726853259.42473: done checking for max_fail_percentage 11683 1726853259.42474: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.42475: done checking to see if all hosts have failed 11683 1726853259.42476: getting the remaining hosts for this loop 11683 1726853259.42477: done getting the remaining hosts for this loop 11683 1726853259.42481: getting the next task for host managed_node3 11683 1726853259.42487: done getting next task for host managed_node3 11683 1726853259.42491: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11683 1726853259.42493: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853259.42507: getting variables 11683 1726853259.42508: in VariableManager get_vars() 11683 1726853259.42552: Calling all_inventory to load vars for managed_node3 11683 1726853259.42554: Calling groups_inventory to load vars for managed_node3 11683 1726853259.42556: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.42566: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.42568: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.42577: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.43912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.44826: done with get_vars() 11683 1726853259.44846: done getting variables 11683 1726853259.44919: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:27:39 -0400 (0:00:00.073) 0:00:12.521 ****** 
11683 1726853259.44945: entering _queue_task() for managed_node3/dnf 11683 1726853259.45194: worker is 1 (out of 1 available) 11683 1726853259.45210: exiting _queue_task() for managed_node3/dnf 11683 1726853259.45220: done queuing things up, now waiting for results queue to drain 11683 1726853259.45222: waiting for pending results... 11683 1726853259.45394: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11683 1726853259.45474: in run() - task 02083763-bbaf-c5b2-e075-00000000002b 11683 1726853259.45485: variable 'ansible_search_path' from source: unknown 11683 1726853259.45489: variable 'ansible_search_path' from source: unknown 11683 1726853259.45516: calling self._execute() 11683 1726853259.45583: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.45587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.45595: variable 'omit' from source: magic vars 11683 1726853259.45867: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.45976: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.46086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853259.48153: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853259.48225: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853259.48265: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853259.48304: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853259.48333: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853259.48415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.48449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.48493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.48537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.48555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.48776: variable 'ansible_distribution' from source: facts 11683 1726853259.48779: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.48781: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11683 1726853259.48817: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853259.48946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.48977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.49005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.49047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.49066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.49109: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.49134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.49160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.49203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.49219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.49259: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.49288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.49314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.49353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.49370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.49523: variable 'network_connections' from source: task vars 11683 1726853259.49540: variable 'controller_profile' from source: play vars 11683 1726853259.49607: variable 'controller_profile' from source: play vars 11683 1726853259.49776: variable 'controller_device' from source: play vars 11683 1726853259.49779: variable 'controller_device' from source: play vars 11683 1726853259.49781: variable 'port1_profile' from source: play vars 11683 1726853259.49783: variable 'port1_profile' from source: play vars 11683 1726853259.49785: variable 'dhcp_interface1' from source: play vars 11683 1726853259.49818: variable 'dhcp_interface1' from source: play vars 11683 1726853259.49833: variable 'controller_profile' from source: play vars 11683 1726853259.49894: variable 'controller_profile' from source: play vars 11683 1726853259.49905: variable 'port2_profile' from source: play vars 11683 
1726853259.49963: variable 'port2_profile' from source: play vars 11683 1726853259.49978: variable 'dhcp_interface2' from source: play vars 11683 1726853259.50036: variable 'dhcp_interface2' from source: play vars 11683 1726853259.50050: variable 'controller_profile' from source: play vars 11683 1726853259.50111: variable 'controller_profile' from source: play vars 11683 1726853259.50213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853259.50377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853259.50414: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853259.50443: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853259.50473: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853259.50516: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853259.50537: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853259.50562: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.50591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853259.50659: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853259.50900: variable 
'network_connections' from source: task vars 11683 1726853259.50912: variable 'controller_profile' from source: play vars 11683 1726853259.50976: variable 'controller_profile' from source: play vars 11683 1726853259.50990: variable 'controller_device' from source: play vars 11683 1726853259.51057: variable 'controller_device' from source: play vars 11683 1726853259.51073: variable 'port1_profile' from source: play vars 11683 1726853259.51140: variable 'port1_profile' from source: play vars 11683 1726853259.51176: variable 'dhcp_interface1' from source: play vars 11683 1726853259.51214: variable 'dhcp_interface1' from source: play vars 11683 1726853259.51231: variable 'controller_profile' from source: play vars 11683 1726853259.51294: variable 'controller_profile' from source: play vars 11683 1726853259.51334: variable 'port2_profile' from source: play vars 11683 1726853259.51374: variable 'port2_profile' from source: play vars 11683 1726853259.51387: variable 'dhcp_interface2' from source: play vars 11683 1726853259.51450: variable 'dhcp_interface2' from source: play vars 11683 1726853259.51551: variable 'controller_profile' from source: play vars 11683 1726853259.51554: variable 'controller_profile' from source: play vars 11683 1726853259.51566: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11683 1726853259.51576: when evaluation is False, skipping this task 11683 1726853259.51585: _execute() done 11683 1726853259.51592: dumping result to json 11683 1726853259.51599: done dumping result, returning 11683 1726853259.51611: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-00000000002b] 11683 1726853259.51620: sending task result for task 02083763-bbaf-c5b2-e075-00000000002b 11683 1726853259.51876: done sending task result for 
task 02083763-bbaf-c5b2-e075-00000000002b 11683 1726853259.51879: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11683 1726853259.51934: no more pending results, returning what we have 11683 1726853259.51938: results queue empty 11683 1726853259.51939: checking for any_errors_fatal 11683 1726853259.51946: done checking for any_errors_fatal 11683 1726853259.51947: checking for max_fail_percentage 11683 1726853259.51949: done checking for max_fail_percentage 11683 1726853259.51950: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.51951: done checking to see if all hosts have failed 11683 1726853259.51952: getting the remaining hosts for this loop 11683 1726853259.51954: done getting the remaining hosts for this loop 11683 1726853259.51958: getting the next task for host managed_node3 11683 1726853259.51965: done getting next task for host managed_node3 11683 1726853259.51970: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11683 1726853259.51975: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853259.51991: getting variables 11683 1726853259.51993: in VariableManager get_vars() 11683 1726853259.52035: Calling all_inventory to load vars for managed_node3 11683 1726853259.52038: Calling groups_inventory to load vars for managed_node3 11683 1726853259.52041: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.52053: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.52056: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.52060: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.54102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.55795: done with get_vars() 11683 1726853259.55827: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11683 1726853259.55904: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:27:39 -0400 (0:00:00.109) 0:00:12.631 ****** 11683 1726853259.55936: entering _queue_task() for managed_node3/yum 11683 1726853259.55938: Creating lock for yum 11683 1726853259.56266: worker is 1 (out of 1 available) 11683 1726853259.56281: exiting _queue_task() for managed_node3/yum 11683 1726853259.56293: done queuing things up, now waiting for results queue to drain 11683 1726853259.56295: waiting for pending results... 
11683 1726853259.56592: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11683 1726853259.56710: in run() - task 02083763-bbaf-c5b2-e075-00000000002c 11683 1726853259.56731: variable 'ansible_search_path' from source: unknown 11683 1726853259.56740: variable 'ansible_search_path' from source: unknown 11683 1726853259.56781: calling self._execute() 11683 1726853259.56864: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.56877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.56894: variable 'omit' from source: magic vars 11683 1726853259.57260: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.57278: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.57455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853259.60392: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853259.60557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853259.60562: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853259.60564: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853259.60566: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853259.60646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.60687: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.60712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.60748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.60764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.60854: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.60880: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11683 1726853259.60888: when evaluation is False, skipping this task 11683 1726853259.60895: _execute() done 11683 1726853259.60901: dumping result to json 11683 1726853259.60908: done dumping result, returning 11683 1726853259.60918: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-00000000002c] 11683 1726853259.60926: sending task result for task 02083763-bbaf-c5b2-e075-00000000002c skipping: [managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11683 1726853259.61183: no more pending results, returning what we have 11683 1726853259.61188: results queue empty 11683 1726853259.61189: checking for any_errors_fatal 11683 1726853259.61194: done 
checking for any_errors_fatal 11683 1726853259.61195: checking for max_fail_percentage 11683 1726853259.61197: done checking for max_fail_percentage 11683 1726853259.61198: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.61200: done checking to see if all hosts have failed 11683 1726853259.61200: getting the remaining hosts for this loop 11683 1726853259.61202: done getting the remaining hosts for this loop 11683 1726853259.61206: getting the next task for host managed_node3 11683 1726853259.61215: done getting next task for host managed_node3 11683 1726853259.61220: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11683 1726853259.61223: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853259.61239: getting variables 11683 1726853259.61241: in VariableManager get_vars() 11683 1726853259.61291: Calling all_inventory to load vars for managed_node3 11683 1726853259.61294: Calling groups_inventory to load vars for managed_node3 11683 1726853259.61297: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.61309: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.61312: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.61316: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.62084: done sending task result for task 02083763-bbaf-c5b2-e075-00000000002c 11683 1726853259.62088: WORKER PROCESS EXITING 11683 1726853259.62987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.64502: done with get_vars() 11683 1726853259.64527: done getting variables 11683 1726853259.64587: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:27:39 -0400 (0:00:00.086) 0:00:12.718 ****** 11683 1726853259.64620: entering _queue_task() for managed_node3/fail 11683 1726853259.64951: worker is 1 (out of 1 available) 11683 1726853259.64962: exiting _queue_task() for managed_node3/fail 11683 1726853259.64977: done queuing things up, now waiting for results queue to drain 11683 1726853259.64978: waiting for pending results... 
11683 1726853259.65256: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11683 1726853259.65396: in run() - task 02083763-bbaf-c5b2-e075-00000000002d 11683 1726853259.65421: variable 'ansible_search_path' from source: unknown 11683 1726853259.65433: variable 'ansible_search_path' from source: unknown 11683 1726853259.65477: calling self._execute() 11683 1726853259.65562: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.65573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.65587: variable 'omit' from source: magic vars 11683 1726853259.65946: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.65967: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.66073: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853259.66253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853259.69812: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853259.69949: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853259.70022: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853259.70060: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853259.70095: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853259.70182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11683 1726853259.70223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.70254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.70300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.70318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.70374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.70405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.70437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.70482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.70501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.70549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.70581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.70609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.70676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.70679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.70853: variable 'network_connections' from source: task vars 11683 1726853259.70877: variable 'controller_profile' from source: play vars 11683 1726853259.70950: variable 'controller_profile' from source: play vars 11683 1726853259.70977: variable 'controller_device' from source: play vars 11683 1726853259.71032: variable 'controller_device' from source: play vars 11683 1726853259.71077: variable 'port1_profile' from source: play vars 11683 1726853259.71114: variable 'port1_profile' from source: play vars 11683 1726853259.71124: variable 'dhcp_interface1' from source: play vars 11683 1726853259.71185: variable 'dhcp_interface1' from source: play vars 11683 1726853259.71201: variable 'controller_profile' from source: play vars 11683 
1726853259.71302: variable 'controller_profile' from source: play vars 11683 1726853259.71305: variable 'port2_profile' from source: play vars 11683 1726853259.71332: variable 'port2_profile' from source: play vars 11683 1726853259.71342: variable 'dhcp_interface2' from source: play vars 11683 1726853259.71399: variable 'dhcp_interface2' from source: play vars 11683 1726853259.71415: variable 'controller_profile' from source: play vars 11683 1726853259.71477: variable 'controller_profile' from source: play vars 11683 1726853259.71551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853259.71735: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853259.71844: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853259.71847: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853259.71849: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853259.71883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853259.71906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853259.71931: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.71963: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11683 1726853259.72038: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853259.72273: variable 'network_connections' from source: task vars 11683 1726853259.72286: variable 'controller_profile' from source: play vars 11683 1726853259.72342: variable 'controller_profile' from source: play vars 11683 1726853259.72354: variable 'controller_device' from source: play vars 11683 1726853259.72417: variable 'controller_device' from source: play vars 11683 1726853259.72476: variable 'port1_profile' from source: play vars 11683 1726853259.72495: variable 'port1_profile' from source: play vars 11683 1726853259.72511: variable 'dhcp_interface1' from source: play vars 11683 1726853259.72574: variable 'dhcp_interface1' from source: play vars 11683 1726853259.72588: variable 'controller_profile' from source: play vars 11683 1726853259.72652: variable 'controller_profile' from source: play vars 11683 1726853259.72664: variable 'port2_profile' from source: play vars 11683 1726853259.72725: variable 'port2_profile' from source: play vars 11683 1726853259.72737: variable 'dhcp_interface2' from source: play vars 11683 1726853259.72828: variable 'dhcp_interface2' from source: play vars 11683 1726853259.72832: variable 'controller_profile' from source: play vars 11683 1726853259.72867: variable 'controller_profile' from source: play vars 11683 1726853259.72908: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11683 1726853259.72917: when evaluation is False, skipping this task 11683 1726853259.72925: _execute() done 11683 1726853259.72936: dumping result to json 11683 1726853259.73046: done dumping result, returning 11683 1726853259.73050: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-00000000002d] 11683 1726853259.73052: sending 
task result for task 02083763-bbaf-c5b2-e075-00000000002d 11683 1726853259.73124: done sending task result for task 02083763-bbaf-c5b2-e075-00000000002d 11683 1726853259.73127: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
11683 1726853259.73202: no more pending results, returning what we have 11683 1726853259.73205: results queue empty 11683 1726853259.73206: checking for any_errors_fatal 11683 1726853259.73211: done checking for any_errors_fatal 11683 1726853259.73212: checking for max_fail_percentage 11683 1726853259.73214: done checking for max_fail_percentage 11683 1726853259.73215: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.73216: done checking to see if all hosts have failed 11683 1726853259.73216: getting the remaining hosts for this loop 11683 1726853259.73218: done getting the remaining hosts for this loop 11683 1726853259.73221: getting the next task for host managed_node3 11683 1726853259.73227: done getting next task for host managed_node3 11683 1726853259.73231: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11683 1726853259.73234: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 11683 1726853259.73247: getting variables 11683 1726853259.73249: in VariableManager get_vars() 11683 1726853259.73396: Calling all_inventory to load vars for managed_node3 11683 1726853259.73398: Calling groups_inventory to load vars for managed_node3 11683 1726853259.73401: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.73410: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.73413: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.73415: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.76170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.78675: done with get_vars() 11683 1726853259.78699: done getting variables 11683 1726853259.78787: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Friday 20 September 2024 13:27:39 -0400 (0:00:00.141) 0:00:12.860 ******
11683 1726853259.78821: entering _queue_task() for managed_node3/package 11683 1726853259.79165: worker is 1 (out of 1 available) 11683 1726853259.79280: exiting _queue_task() for managed_node3/package 11683 1726853259.79291: done queuing things up, now waiting for results queue to drain 11683 1726853259.79292: waiting for pending results...
11683 1726853259.79463: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11683 1726853259.79626: in run() - task 02083763-bbaf-c5b2-e075-00000000002e 11683 1726853259.79630: variable 'ansible_search_path' from source: unknown 11683 1726853259.79632: variable 'ansible_search_path' from source: unknown 11683 1726853259.79659: calling self._execute() 11683 1726853259.79746: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.79878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.79882: variable 'omit' from source: magic vars 11683 1726853259.80115: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.80129: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853259.80333: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853259.80651: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853259.80709: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853259.80747: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853259.80794: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853259.80927: variable 'network_packages' from source: role '' defaults 11683 1726853259.81218: variable '__network_provider_setup' from source: role '' defaults 11683 1726853259.81233: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853259.81378: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853259.81382: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853259.81384: variable 
'__network_packages_default_nm' from source: role '' defaults 11683 1726853259.81576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853259.83628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853259.83709: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853259.83745: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853259.83783: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853259.83811: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853259.83893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.83923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.83948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.83994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.84013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 
1726853259.84057: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.84084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.84176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.84179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.84181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.84397: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11683 1726853259.84518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.84547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.84576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.84619: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.84642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.84759: variable 'ansible_python' from source: facts 11683 1726853259.84793: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11683 1726853259.84977: variable '__network_wpa_supplicant_required' from source: role '' defaults 11683 1726853259.85277: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11683 1726853259.85340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.85407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.85523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.85567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.85678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.85704: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853259.85777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853259.85811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.85854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853259.85875: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853259.86277: variable 'network_connections' from source: task vars 11683 1726853259.86281: variable 'controller_profile' from source: play vars 11683 1726853259.86476: variable 'controller_profile' from source: play vars 11683 1726853259.86479: variable 'controller_device' from source: play vars 11683 1726853259.86778: variable 'controller_device' from source: play vars 11683 1726853259.86782: variable 'port1_profile' from source: play vars 11683 1726853259.86882: variable 'port1_profile' from source: play vars 11683 1726853259.86899: variable 'dhcp_interface1' from source: play vars 11683 1726853259.87111: variable 'dhcp_interface1' from source: play vars 11683 1726853259.87125: variable 'controller_profile' from source: play vars 11683 1726853259.87265: variable 'controller_profile' from source: play vars 11683 1726853259.87284: variable 'port2_profile' from source: play vars 11683 
1726853259.87395: variable 'port2_profile' from source: play vars 11683 1726853259.87411: variable 'dhcp_interface2' from source: play vars 11683 1726853259.87513: variable 'dhcp_interface2' from source: play vars 11683 1726853259.87529: variable 'controller_profile' from source: play vars 11683 1726853259.87632: variable 'controller_profile' from source: play vars 11683 1726853259.87715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853259.87741: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853259.87778: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853259.87807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853259.87856: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853259.88143: variable 'network_connections' from source: task vars 11683 1726853259.88190: variable 'controller_profile' from source: play vars 11683 1726853259.88266: variable 'controller_profile' from source: play vars 11683 1726853259.88285: variable 'controller_device' from source: play vars 11683 1726853259.88391: variable 'controller_device' from source: play vars 11683 1726853259.88412: variable 'port1_profile' from source: play vars 11683 1726853259.88516: variable 'port1_profile' from source: play vars 11683 1726853259.88579: variable 'dhcp_interface1' from source: play vars 11683 1726853259.88643: variable 'dhcp_interface1' from source: 
play vars 11683 1726853259.88658: variable 'controller_profile' from source: play vars 11683 1726853259.88762: variable 'controller_profile' from source: play vars 11683 1726853259.88782: variable 'port2_profile' from source: play vars 11683 1726853259.88888: variable 'port2_profile' from source: play vars 11683 1726853259.88903: variable 'dhcp_interface2' from source: play vars 11683 1726853259.89008: variable 'dhcp_interface2' from source: play vars 11683 1726853259.89061: variable 'controller_profile' from source: play vars 11683 1726853259.89127: variable 'controller_profile' from source: play vars 11683 1726853259.89197: variable '__network_packages_default_wireless' from source: role '' defaults 11683 1726853259.89285: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853259.89614: variable 'network_connections' from source: task vars 11683 1726853259.89624: variable 'controller_profile' from source: play vars 11683 1726853259.89688: variable 'controller_profile' from source: play vars 11683 1726853259.89712: variable 'controller_device' from source: play vars 11683 1726853259.89770: variable 'controller_device' from source: play vars 11683 1726853259.89822: variable 'port1_profile' from source: play vars 11683 1726853259.89857: variable 'port1_profile' from source: play vars 11683 1726853259.89868: variable 'dhcp_interface1' from source: play vars 11683 1726853259.89937: variable 'dhcp_interface1' from source: play vars 11683 1726853259.89949: variable 'controller_profile' from source: play vars 11683 1726853259.90017: variable 'controller_profile' from source: play vars 11683 1726853259.90038: variable 'port2_profile' from source: play vars 11683 1726853259.90148: variable 'port2_profile' from source: play vars 11683 1726853259.90150: variable 'dhcp_interface2' from source: play vars 11683 1726853259.90181: variable 'dhcp_interface2' from source: play vars 11683 1726853259.90191: variable 'controller_profile' from 
source: play vars 11683 1726853259.90257: variable 'controller_profile' from source: play vars 11683 1726853259.90289: variable '__network_packages_default_team' from source: role '' defaults 11683 1726853259.90374: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853259.90695: variable 'network_connections' from source: task vars 11683 1726853259.90706: variable 'controller_profile' from source: play vars 11683 1726853259.90794: variable 'controller_profile' from source: play vars 11683 1726853259.90797: variable 'controller_device' from source: play vars 11683 1726853259.91144: variable 'controller_device' from source: play vars 11683 1726853259.91147: variable 'port1_profile' from source: play vars 11683 1726853259.91186: variable 'port1_profile' from source: play vars 11683 1726853259.91198: variable 'dhcp_interface1' from source: play vars 11683 1726853259.91554: variable 'dhcp_interface1' from source: play vars 11683 1726853259.91557: variable 'controller_profile' from source: play vars 11683 1726853259.91559: variable 'controller_profile' from source: play vars 11683 1726853259.91561: variable 'port2_profile' from source: play vars 11683 1726853259.91664: variable 'port2_profile' from source: play vars 11683 1726853259.91680: variable 'dhcp_interface2' from source: play vars 11683 1726853259.91742: variable 'dhcp_interface2' from source: play vars 11683 1726853259.91784: variable 'controller_profile' from source: play vars 11683 1726853259.91936: variable 'controller_profile' from source: play vars 11683 1726853259.92114: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853259.92176: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853259.92211: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853259.92307: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 
1726853259.92701: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11683 1726853259.93211: variable 'network_connections' from source: task vars 11683 1726853259.93221: variable 'controller_profile' from source: play vars 11683 1726853259.93283: variable 'controller_profile' from source: play vars 11683 1726853259.93303: variable 'controller_device' from source: play vars 11683 1726853259.93361: variable 'controller_device' from source: play vars 11683 1726853259.93378: variable 'port1_profile' from source: play vars 11683 1726853259.93441: variable 'port1_profile' from source: play vars 11683 1726853259.93452: variable 'dhcp_interface1' from source: play vars 11683 1726853259.93520: variable 'dhcp_interface1' from source: play vars 11683 1726853259.93530: variable 'controller_profile' from source: play vars 11683 1726853259.93590: variable 'controller_profile' from source: play vars 11683 1726853259.93601: variable 'port2_profile' from source: play vars 11683 1726853259.93664: variable 'port2_profile' from source: play vars 11683 1726853259.93726: variable 'dhcp_interface2' from source: play vars 11683 1726853259.93741: variable 'dhcp_interface2' from source: play vars 11683 1726853259.93751: variable 'controller_profile' from source: play vars 11683 1726853259.93811: variable 'controller_profile' from source: play vars 11683 1726853259.93824: variable 'ansible_distribution' from source: facts 11683 1726853259.93835: variable '__network_rh_distros' from source: role '' defaults 11683 1726853259.93845: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.93876: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11683 1726853259.94039: variable 'ansible_distribution' from source: facts 11683 1726853259.94052: variable '__network_rh_distros' from source: role '' defaults 11683 1726853259.94159: variable 'ansible_distribution_major_version' from source: 
facts 11683 1726853259.94162: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11683 1726853259.94243: variable 'ansible_distribution' from source: facts 11683 1726853259.94252: variable '__network_rh_distros' from source: role '' defaults 11683 1726853259.94262: variable 'ansible_distribution_major_version' from source: facts 11683 1726853259.94304: variable 'network_provider' from source: set_fact 11683 1726853259.94377: variable 'ansible_facts' from source: unknown 11683 1726853259.95262: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11683 1726853259.95276: when evaluation is False, skipping this task 11683 1726853259.95378: _execute() done 11683 1726853259.95382: dumping result to json 11683 1726853259.95384: done dumping result, returning 11683 1726853259.95387: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-c5b2-e075-00000000002e] 11683 1726853259.95389: sending task result for task 02083763-bbaf-c5b2-e075-00000000002e 11683 1726853259.95460: done sending task result for task 02083763-bbaf-c5b2-e075-00000000002e 11683 1726853259.95463: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
11683 1726853259.95538: no more pending results, returning what we have 11683 1726853259.95543: results queue empty 11683 1726853259.95544: checking for any_errors_fatal 11683 1726853259.95550: done checking for any_errors_fatal 11683 1726853259.95551: checking for max_fail_percentage 11683 1726853259.95553: done checking for max_fail_percentage 11683 1726853259.95554: checking to see if all hosts have failed and the running result is not ok 11683 1726853259.95556: done checking to see if all hosts have failed 11683 1726853259.95557: getting the remaining hosts for
this loop 11683 1726853259.95559: done getting the remaining hosts for this loop 11683 1726853259.95563: getting the next task for host managed_node3 11683 1726853259.95572: done getting next task for host managed_node3 11683 1726853259.95577: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11683 1726853259.95580: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853259.95595: getting variables 11683 1726853259.95597: in VariableManager get_vars() 11683 1726853259.95637: Calling all_inventory to load vars for managed_node3 11683 1726853259.95640: Calling groups_inventory to load vars for managed_node3 11683 1726853259.95643: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853259.95654: Calling all_plugins_play to load vars for managed_node3 11683 1726853259.95657: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853259.95660: Calling groups_plugins_play to load vars for managed_node3 11683 1726853259.97407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853259.98905: done with get_vars() 11683 1726853259.98938: done getting variables 11683 1726853259.99003: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024 13:27:39 -0400 (0:00:00.202) 0:00:13.062 ******
11683 1726853259.99029: entering _queue_task() for managed_node3/package 11683 1726853259.99278: worker is 1 (out of 1 available) 11683 1726853259.99293: exiting _queue_task() for managed_node3/package 11683 1726853259.99305: done queuing things up, now waiting for results queue to drain 11683 1726853259.99306: waiting for pending results... 11683 1726853259.99484: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11683 1726853259.99678: in run() - task 02083763-bbaf-c5b2-e075-00000000002f 11683 1726853259.99682: variable 'ansible_search_path' from source: unknown 11683 1726853259.99685: variable 'ansible_search_path' from source: unknown 11683 1726853259.99687: calling self._execute() 11683 1726853259.99725: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853259.99738: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853259.99755: variable 'omit' from source: magic vars 11683 1726853260.00256: variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.00277: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853260.00422: variable 'network_state' from source: role '' defaults 11683 1726853260.00439: Evaluated conditional (network_state != {}): False 11683 1726853260.00448: when evaluation is False, skipping this task 11683 1726853260.00458: _execute() done 11683
1726853260.00466: dumping result to json 11683 1726853260.00477: done dumping result, returning 11683 1726853260.00491: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-c5b2-e075-00000000002f] 11683 1726853260.00502: sending task result for task 02083763-bbaf-c5b2-e075-00000000002f
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11683 1726853260.00684: no more pending results, returning what we have 11683 1726853260.00689: results queue empty 11683 1726853260.00690: checking for any_errors_fatal 11683 1726853260.00695: done checking for any_errors_fatal 11683 1726853260.00696: checking for max_fail_percentage 11683 1726853260.00698: done checking for max_fail_percentage 11683 1726853260.00699: checking to see if all hosts have failed and the running result is not ok 11683 1726853260.00700: done checking to see if all hosts have failed 11683 1726853260.00701: getting the remaining hosts for this loop 11683 1726853260.00703: done getting the remaining hosts for this loop 11683 1726853260.00707: getting the next task for host managed_node3 11683 1726853260.00714: done getting next task for host managed_node3 11683 1726853260.00721: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11683 1726853260.00727: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 11683 1726853260.00745: getting variables 11683 1726853260.00747: in VariableManager get_vars() 11683 1726853260.01002: Calling all_inventory to load vars for managed_node3 11683 1726853260.01005: Calling groups_inventory to load vars for managed_node3 11683 1726853260.01007: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853260.01019: Calling all_plugins_play to load vars for managed_node3 11683 1726853260.01023: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853260.01027: Calling groups_plugins_play to load vars for managed_node3 11683 1726853260.01680: done sending task result for task 02083763-bbaf-c5b2-e075-00000000002f 11683 1726853260.01686: WORKER PROCESS EXITING 11683 1726853260.02963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853260.08454: done with get_vars() 11683 1726853260.08485: done getting variables 11683 1726853260.08532: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024 13:27:40 -0400 (0:00:00.095) 0:00:13.157 ******
11683 1726853260.08564: entering _queue_task() for managed_node3/package 11683 1726853260.08904: worker is 1 (out of 1 available) 11683 1726853260.08917: exiting _queue_task() for managed_node3/package 11683 1726853260.08929: done queuing things up, now waiting for results queue to drain 11683 1726853260.08930: waiting for pending results...
11683 1726853260.09210: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11683 1726853260.09360: in run() - task 02083763-bbaf-c5b2-e075-000000000030 11683 1726853260.09383: variable 'ansible_search_path' from source: unknown 11683 1726853260.09396: variable 'ansible_search_path' from source: unknown 11683 1726853260.09436: calling self._execute() 11683 1726853260.09532: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853260.09548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853260.09569: variable 'omit' from source: magic vars 11683 1726853260.09857: variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.09867: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853260.09951: variable 'network_state' from source: role '' defaults 11683 1726853260.09960: Evaluated conditional (network_state != {}): False 11683 1726853260.09963: when evaluation is False, skipping this task 11683 1726853260.09966: _execute() done 11683 1726853260.09968: dumping result to json 11683 1726853260.09972: done dumping result, returning 11683 1726853260.09980: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-c5b2-e075-000000000030] 11683 1726853260.09985: sending task result for task 02083763-bbaf-c5b2-e075-000000000030 skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853260.10121: no more pending results, returning what we have 11683 1726853260.10124: results queue empty 11683 1726853260.10125: checking for any_errors_fatal 11683 1726853260.10131: done checking for any_errors_fatal 11683 1726853260.10132: checking for max_fail_percentage 11683 
1726853260.10134: done checking for max_fail_percentage 11683 1726853260.10134: checking to see if all hosts have failed and the running result is not ok 11683 1726853260.10136: done checking to see if all hosts have failed 11683 1726853260.10137: getting the remaining hosts for this loop 11683 1726853260.10139: done getting the remaining hosts for this loop 11683 1726853260.10141: getting the next task for host managed_node3 11683 1726853260.10147: done getting next task for host managed_node3 11683 1726853260.10152: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11683 1726853260.10155: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853260.10170: getting variables 11683 1726853260.10173: in VariableManager get_vars() 11683 1726853260.10209: Calling all_inventory to load vars for managed_node3 11683 1726853260.10211: Calling groups_inventory to load vars for managed_node3 11683 1726853260.10213: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853260.10222: Calling all_plugins_play to load vars for managed_node3 11683 1726853260.10225: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853260.10227: Calling groups_plugins_play to load vars for managed_node3 11683 1726853260.10786: done sending task result for task 02083763-bbaf-c5b2-e075-000000000030 11683 1726853260.10789: WORKER PROCESS EXITING 11683 1726853260.10963: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853260.12362: done with get_vars() 11683 1726853260.12380: done getting variables 11683 1726853260.12449: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:27:40 -0400 (0:00:00.039) 0:00:13.196 ****** 11683 1726853260.12476: entering _queue_task() for managed_node3/service 11683 1726853260.12477: Creating lock for service 11683 1726853260.12709: worker is 1 (out of 1 available) 11683 1726853260.12724: exiting _queue_task() for managed_node3/service 11683 1726853260.12735: done queuing things up, now waiting for results queue to drain 11683 1726853260.12736: waiting for pending results... 
11683 1726853260.12905: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11683 1726853260.12991: in run() - task 02083763-bbaf-c5b2-e075-000000000031 11683 1726853260.13002: variable 'ansible_search_path' from source: unknown 11683 1726853260.13006: variable 'ansible_search_path' from source: unknown 11683 1726853260.13033: calling self._execute() 11683 1726853260.13103: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853260.13106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853260.13116: variable 'omit' from source: magic vars 11683 1726853260.13394: variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.13406: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853260.13485: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853260.13777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853260.15435: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853260.15722: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853260.15749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853260.15776: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853260.15800: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853260.15858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11683 1726853260.15881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.15900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.15928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.15938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.15976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853260.15994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.16013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.16038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.16051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.16080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853260.16097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.16115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.16139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.16153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.16263: variable 'network_connections' from source: task vars 11683 1726853260.16275: variable 'controller_profile' from source: play vars 11683 1726853260.16324: variable 'controller_profile' from source: play vars 11683 1726853260.16333: variable 'controller_device' from source: play vars 11683 1726853260.16378: variable 'controller_device' from source: play vars 11683 1726853260.16386: variable 'port1_profile' from source: play vars 11683 1726853260.16427: variable 'port1_profile' from source: play vars 11683 1726853260.16433: variable 'dhcp_interface1' from source: play vars 11683 1726853260.16479: variable 'dhcp_interface1' from source: play vars 11683 1726853260.16484: variable 'controller_profile' from source: play vars 11683 
1726853260.16527: variable 'controller_profile' from source: play vars 11683 1726853260.16530: variable 'port2_profile' from source: play vars 11683 1726853260.16582: variable 'port2_profile' from source: play vars 11683 1726853260.16603: variable 'dhcp_interface2' from source: play vars 11683 1726853260.16676: variable 'dhcp_interface2' from source: play vars 11683 1726853260.16679: variable 'controller_profile' from source: play vars 11683 1726853260.16805: variable 'controller_profile' from source: play vars 11683 1726853260.16808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853260.16934: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853260.17138: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853260.17141: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853260.17144: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853260.17146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853260.17148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853260.17150: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.17152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11683 1726853260.17380: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853260.17428: variable 'network_connections' from source: task vars 11683 1726853260.17431: variable 'controller_profile' from source: play vars 11683 1726853260.17488: variable 'controller_profile' from source: play vars 11683 1726853260.17491: variable 'controller_device' from source: play vars 11683 1726853260.17546: variable 'controller_device' from source: play vars 11683 1726853260.17556: variable 'port1_profile' from source: play vars 11683 1726853260.17608: variable 'port1_profile' from source: play vars 11683 1726853260.17615: variable 'dhcp_interface1' from source: play vars 11683 1726853260.17674: variable 'dhcp_interface1' from source: play vars 11683 1726853260.17680: variable 'controller_profile' from source: play vars 11683 1726853260.17733: variable 'controller_profile' from source: play vars 11683 1726853260.17739: variable 'port2_profile' from source: play vars 11683 1726853260.17799: variable 'port2_profile' from source: play vars 11683 1726853260.17806: variable 'dhcp_interface2' from source: play vars 11683 1726853260.17865: variable 'dhcp_interface2' from source: play vars 11683 1726853260.17872: variable 'controller_profile' from source: play vars 11683 1726853260.17927: variable 'controller_profile' from source: play vars 11683 1726853260.17961: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11683 1726853260.17964: when evaluation is False, skipping this task 11683 1726853260.17967: _execute() done 11683 1726853260.17969: dumping result to json 11683 1726853260.17973: done dumping result, returning 11683 1726853260.18033: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-000000000031] 11683 1726853260.18036: sending task result for task 
02083763-bbaf-c5b2-e075-000000000031 11683 1726853260.18096: done sending task result for task 02083763-bbaf-c5b2-e075-000000000031 11683 1726853260.18099: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11683 1726853260.18178: no more pending results, returning what we have 11683 1726853260.18182: results queue empty 11683 1726853260.18183: checking for any_errors_fatal 11683 1726853260.18189: done checking for any_errors_fatal 11683 1726853260.18189: checking for max_fail_percentage 11683 1726853260.18191: done checking for max_fail_percentage 11683 1726853260.18192: checking to see if all hosts have failed and the running result is not ok 11683 1726853260.18193: done checking to see if all hosts have failed 11683 1726853260.18193: getting the remaining hosts for this loop 11683 1726853260.18195: done getting the remaining hosts for this loop 11683 1726853260.18198: getting the next task for host managed_node3 11683 1726853260.18204: done getting next task for host managed_node3 11683 1726853260.18208: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11683 1726853260.18210: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853260.18224: getting variables 11683 1726853260.18226: in VariableManager get_vars() 11683 1726853260.18262: Calling all_inventory to load vars for managed_node3 11683 1726853260.18265: Calling groups_inventory to load vars for managed_node3 11683 1726853260.18267: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853260.18280: Calling all_plugins_play to load vars for managed_node3 11683 1726853260.18283: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853260.18286: Calling groups_plugins_play to load vars for managed_node3 11683 1726853260.19788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853260.21361: done with get_vars() 11683 1726853260.21391: done getting variables 11683 1726853260.21461: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:27:40 -0400 (0:00:00.090) 0:00:13.286 ****** 11683 1726853260.21494: entering _queue_task() for managed_node3/service 11683 1726853260.21830: worker is 1 (out of 1 available) 11683 1726853260.21842: exiting _queue_task() for managed_node3/service 11683 1726853260.21854: done queuing things up, now waiting for results queue to drain 11683 1726853260.21855: waiting for pending results... 
11683 1726853260.22199: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11683 1726853260.22280: in run() - task 02083763-bbaf-c5b2-e075-000000000032 11683 1726853260.22378: variable 'ansible_search_path' from source: unknown 11683 1726853260.22381: variable 'ansible_search_path' from source: unknown 11683 1726853260.22383: calling self._execute() 11683 1726853260.22460: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853260.22475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853260.22492: variable 'omit' from source: magic vars 11683 1726853260.22916: variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.22934: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853260.23116: variable 'network_provider' from source: set_fact 11683 1726853260.23127: variable 'network_state' from source: role '' defaults 11683 1726853260.23141: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11683 1726853260.23153: variable 'omit' from source: magic vars 11683 1726853260.23221: variable 'omit' from source: magic vars 11683 1726853260.23278: variable 'network_service_name' from source: role '' defaults 11683 1726853260.23331: variable 'network_service_name' from source: role '' defaults 11683 1726853260.23445: variable '__network_provider_setup' from source: role '' defaults 11683 1726853260.23496: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853260.23530: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853260.23544: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853260.23618: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853260.23855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 11683 1726853260.26109: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853260.26136: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853260.26183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853260.26231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853260.26261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853260.26352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853260.26393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.26423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.26479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.26543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.26555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11683 1726853260.26585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.26613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.26662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.26683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.26931: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11683 1726853260.27060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853260.27176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.27179: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.27181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.27183: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.27254: variable 'ansible_python' from source: facts 11683 1726853260.27281: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11683 1726853260.27385: variable '__network_wpa_supplicant_required' from source: role '' defaults 11683 1726853260.27480: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11683 1726853260.27620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853260.27658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.27691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.27736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.27761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.27822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853260.27977: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853260.27980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.27983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853260.27985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853260.28107: variable 'network_connections' from source: task vars 11683 1726853260.28120: variable 'controller_profile' from source: play vars 11683 1726853260.28200: variable 'controller_profile' from source: play vars 11683 1726853260.28226: variable 'controller_device' from source: play vars 11683 1726853260.28303: variable 'controller_device' from source: play vars 11683 1726853260.28329: variable 'port1_profile' from source: play vars 11683 1726853260.28404: variable 'port1_profile' from source: play vars 11683 1726853260.28420: variable 'dhcp_interface1' from source: play vars 11683 1726853260.28504: variable 'dhcp_interface1' from source: play vars 11683 1726853260.28519: variable 'controller_profile' from source: play vars 11683 1726853260.28597: variable 'controller_profile' from source: play vars 11683 1726853260.28646: variable 'port2_profile' from source: play vars 11683 1726853260.28682: variable 'port2_profile' from source: play vars 11683 1726853260.28696: variable 'dhcp_interface2' from source: play vars 11683 1726853260.28768: variable 'dhcp_interface2' from source: play vars 11683 
1726853260.28785: variable 'controller_profile' from source: play vars 11683 1726853260.28866: variable 'controller_profile' from source: play vars 11683 1726853260.29077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853260.29199: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853260.29250: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853260.29304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853260.29349: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853260.29423: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853260.29458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853260.29497: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853260.29542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853260.29597: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853260.29902: variable 'network_connections' from source: task vars 11683 1726853260.29914: variable 'controller_profile' from source: play vars 11683 1726853260.30064: variable 'controller_profile' from source: play vars 11683 
1726853260.30068: variable 'controller_device' from source: play vars 11683 1726853260.30093: variable 'controller_device' from source: play vars 11683 1726853260.30110: variable 'port1_profile' from source: play vars 11683 1726853260.30190: variable 'port1_profile' from source: play vars 11683 1726853260.30206: variable 'dhcp_interface1' from source: play vars 11683 1726853260.30284: variable 'dhcp_interface1' from source: play vars 11683 1726853260.30304: variable 'controller_profile' from source: play vars 11683 1726853260.30378: variable 'controller_profile' from source: play vars 11683 1726853260.30403: variable 'port2_profile' from source: play vars 11683 1726853260.30479: variable 'port2_profile' from source: play vars 11683 1726853260.30511: variable 'dhcp_interface2' from source: play vars 11683 1726853260.30576: variable 'dhcp_interface2' from source: play vars 11683 1726853260.30591: variable 'controller_profile' from source: play vars 11683 1726853260.30729: variable 'controller_profile' from source: play vars 11683 1726853260.30732: variable '__network_packages_default_wireless' from source: role '' defaults 11683 1726853260.30811: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853260.31128: variable 'network_connections' from source: task vars 11683 1726853260.31138: variable 'controller_profile' from source: play vars 11683 1726853260.31218: variable 'controller_profile' from source: play vars 11683 1726853260.31231: variable 'controller_device' from source: play vars 11683 1726853260.31308: variable 'controller_device' from source: play vars 11683 1726853260.31321: variable 'port1_profile' from source: play vars 11683 1726853260.31397: variable 'port1_profile' from source: play vars 11683 1726853260.31409: variable 'dhcp_interface1' from source: play vars 11683 1726853260.31576: variable 'dhcp_interface1' from source: play vars 11683 1726853260.31580: variable 'controller_profile' from source: play vars 
11683 1726853260.31583: variable 'controller_profile' from source: play vars 11683 1726853260.31585: variable 'port2_profile' from source: play vars 11683 1726853260.31654: variable 'port2_profile' from source: play vars 11683 1726853260.31666: variable 'dhcp_interface2' from source: play vars 11683 1726853260.31746: variable 'dhcp_interface2' from source: play vars 11683 1726853260.31759: variable 'controller_profile' from source: play vars 11683 1726853260.31838: variable 'controller_profile' from source: play vars 11683 1726853260.31873: variable '__network_packages_default_team' from source: role '' defaults 11683 1726853260.31961: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853260.32305: variable 'network_connections' from source: task vars 11683 1726853260.32315: variable 'controller_profile' from source: play vars 11683 1726853260.32395: variable 'controller_profile' from source: play vars 11683 1726853260.32408: variable 'controller_device' from source: play vars 11683 1726853260.32485: variable 'controller_device' from source: play vars 11683 1726853260.32500: variable 'port1_profile' from source: play vars 11683 1726853260.32568: variable 'port1_profile' from source: play vars 11683 1726853260.32680: variable 'dhcp_interface1' from source: play vars 11683 1726853260.32683: variable 'dhcp_interface1' from source: play vars 11683 1726853260.32686: variable 'controller_profile' from source: play vars 11683 1726853260.32735: variable 'controller_profile' from source: play vars 11683 1726853260.32747: variable 'port2_profile' from source: play vars 11683 1726853260.32826: variable 'port2_profile' from source: play vars 11683 1726853260.32838: variable 'dhcp_interface2' from source: play vars 11683 1726853260.32914: variable 'dhcp_interface2' from source: play vars 11683 1726853260.32926: variable 'controller_profile' from source: play vars 11683 1726853260.32995: variable 'controller_profile' from source: play vars 
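The long runs of `variable '…' from source: play vars` / `role '' defaults` / `task vars` / `set_fact` above are the variable manager resolving each template reference against layered variable sources, where higher-precedence sources win. As a rough illustration (this ordering is a simplification of ansible-core's full precedence list, and `resolve` is a hypothetical helper, not real ansible code):

```python
# Hedged sketch: later sources in this list override earlier ones.
# This is a simplified subset of ansible-core's variable precedence.
PRECEDENCE = ["role defaults", "facts", "play vars", "task vars", "set_fact"]

def resolve(layers: dict) -> dict:
    """Merge variable layers in precedence order (assumed helper)."""
    merged = {}
    for source in PRECEDENCE:
        merged.update(layers.get(source, {}))
    return merged

# e.g. a role default overridden by a later set_fact, as seen with
# 'network_provider' from source: set_fact in the log above:
vars_ = resolve({
    "role defaults": {"network_provider": "initscripts"},
    "set_fact": {"network_provider": "nm"},
})
```

Here `set_fact` shadows the role default, which is why the log reports `variable 'network_provider' from source: set_fact` rather than the role default.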
11683 1726853260.33073: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853260.33176: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853260.33179: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853260.33225: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853260.33455: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11683 1726853260.33956: variable 'network_connections' from source: task vars 11683 1726853260.33966: variable 'controller_profile' from source: play vars 11683 1726853260.34076: variable 'controller_profile' from source: play vars 11683 1726853260.34079: variable 'controller_device' from source: play vars 11683 1726853260.34111: variable 'controller_device' from source: play vars 11683 1726853260.34123: variable 'port1_profile' from source: play vars 11683 1726853260.34183: variable 'port1_profile' from source: play vars 11683 1726853260.34195: variable 'dhcp_interface1' from source: play vars 11683 1726853260.34261: variable 'dhcp_interface1' from source: play vars 11683 1726853260.34276: variable 'controller_profile' from source: play vars 11683 1726853260.34339: variable 'controller_profile' from source: play vars 11683 1726853260.34376: variable 'port2_profile' from source: play vars 11683 1726853260.34404: variable 'port2_profile' from source: play vars 11683 1726853260.34413: variable 'dhcp_interface2' from source: play vars 11683 1726853260.34474: variable 'dhcp_interface2' from source: play vars 11683 1726853260.34534: variable 'controller_profile' from source: play vars 11683 1726853260.34551: variable 'controller_profile' from source: play vars 11683 1726853260.34565: variable 'ansible_distribution' from source: facts 11683 1726853260.34576: variable '__network_rh_distros' from source: role '' defaults 11683 1726853260.34585: 
variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.34617: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11683 1726853260.34824: variable 'ansible_distribution' from source: facts 11683 1726853260.34834: variable '__network_rh_distros' from source: role '' defaults 11683 1726853260.34844: variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.34869: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11683 1726853260.35078: variable 'ansible_distribution' from source: facts 11683 1726853260.35090: variable '__network_rh_distros' from source: role '' defaults 11683 1726853260.35101: variable 'ansible_distribution_major_version' from source: facts 11683 1726853260.35150: variable 'network_provider' from source: set_fact 11683 1726853260.35182: variable 'omit' from source: magic vars 11683 1726853260.35222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853260.35277: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853260.35283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853260.35305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853260.35321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853260.35478: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853260.35481: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853260.35483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853260.35485: Set connection var ansible_shell_executable to /bin/sh 11683 
1726853260.35502: Set connection var ansible_timeout to 10 11683 1726853260.35514: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853260.35523: Set connection var ansible_pipelining to False 11683 1726853260.35529: Set connection var ansible_shell_type to sh 11683 1726853260.35535: Set connection var ansible_connection to ssh 11683 1726853260.35563: variable 'ansible_shell_executable' from source: unknown 11683 1726853260.35573: variable 'ansible_connection' from source: unknown 11683 1726853260.35581: variable 'ansible_module_compression' from source: unknown 11683 1726853260.35588: variable 'ansible_shell_type' from source: unknown 11683 1726853260.35601: variable 'ansible_shell_executable' from source: unknown 11683 1726853260.35609: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853260.35617: variable 'ansible_pipelining' from source: unknown 11683 1726853260.35624: variable 'ansible_timeout' from source: unknown 11683 1726853260.35632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853260.35747: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853260.35820: variable 'omit' from source: magic vars 11683 1726853260.35823: starting attempt loop 11683 1726853260.35825: running the handler 11683 1726853260.35852: variable 'ansible_facts' from source: unknown 11683 1726853260.36643: _low_level_execute_command(): starting 11683 1726853260.36657: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853260.37385: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853260.37467: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.37527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853260.37625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853260.39360: stdout chunk (state=3): >>>/root <<< 11683 1726853260.39513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853260.39525: stdout chunk (state=3): >>><<< 11683 1726853260.39544: stderr chunk (state=3): >>><<< 11683 1726853260.39674: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853260.39683: _low_level_execute_command(): starting 11683 1726853260.39686: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382 `" && echo ansible-tmp-1726853260.3957658-12264-49659479640382="` echo /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382 `" ) && sleep 0' 11683 1726853260.40258: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853260.40262: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853260.40276: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853260.40290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853260.40302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853260.40393: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
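The `umask 77 && mkdir -p … && mkdir "ansible-tmp-…"` command above is how Ansible stages a per-task working directory on the remote host: a uniquely named `ansible-tmp-<epoch>-<pid>-<random>` directory under `~/.ansible/tmp`, created with a restrictive umask so only the connecting user can access it. A minimal local re-creation of that naming and permission scheme (a sketch, not ansible-core's implementation):

```python
import os
import random
import time

def make_remote_tmpdir(base: str) -> str:
    """Create an ansible-tmp-style per-task directory under `base`,
    mimicking the remote `umask 77 && mkdir` command shown in the log."""
    old_umask = os.umask(0o077)  # restrict to owner, like `umask 77`
    try:
        os.makedirs(base, exist_ok=True)
        name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
        path = os.path.join(base, name)
        os.mkdir(path)  # fails if the name already exists, as in the real command
        return path
    finally:
        os.umask(old_umask)

tmpdir = make_remote_tmpdir("/tmp/demo-ansible-tmp")
```

The directory name combines a timestamp, the controller's PID, and a random suffix so concurrent tasks never collide, and the `echo ansible-tmp-…=…` in the real command reports the resulting path back to the controller on stdout.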
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.40411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853260.40424: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853260.40451: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853260.40520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853260.42794: stdout chunk (state=3): >>>ansible-tmp-1726853260.3957658-12264-49659479640382=/root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382 <<< 11683 1726853260.42848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853260.42852: stderr chunk (state=3): >>><<< 11683 1726853260.42857: stdout chunk (state=3): >>><<< 11683 1726853260.42882: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853260.3957658-12264-49659479640382=/root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853260.42914: variable 'ansible_module_compression' from source: unknown 11683 1726853260.42974: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11683 1726853260.42978: ANSIBALLZ: Acquiring lock 11683 1726853260.42981: ANSIBALLZ: Lock acquired: 139785061355968 11683 1726853260.42983: ANSIBALLZ: Creating module 11683 1726853260.68549: ANSIBALLZ: Writing module into payload 11683 1726853260.68716: ANSIBALLZ: Writing module 11683 1726853260.68744: ANSIBALLZ: Renaming module 11683 1726853260.68753: ANSIBALLZ: Done creating module 11683 1726853260.68779: variable 'ansible_facts' from source: unknown 11683 1726853260.68965: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py 11683 1726853260.69194: Sending initial data 11683 1726853260.69197: Sent initial data (155 bytes) 11683 1726853260.69747: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853260.69763: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853260.69866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853260.69870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853260.69874: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 
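The `ANSIBALLZ: Creating module / Writing module into payload / Renaming module` sequence above is ansible-core packaging the `systemd` module and its dependencies into a single self-contained `AnsiballZ_systemd.py` script for transfer. The core idea is a zip archive embedded (base64-encoded) in a small Python wrapper that extracts and runs it. A highly simplified sketch of the payload-building step (the wrapper generation and dependency scanning are omitted, and the names here are assumptions, not ansible-core internals):

```python
import base64
import io
import zipfile

def build_payload(module_source: str) -> str:
    """Zip the module source and base64-encode the archive, so it can be
    embedded in a single self-extracting wrapper script (simplified
    AnsiballZ-style packaging; not the real implementation)."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr("ansible_module.py", module_source)
    return base64.b64encode(buf.getvalue()).decode("ascii")

payload = build_payload("print('hello from module')")
```

Packing everything into one file is what lets the controller push a module with a single SFTP `put` and run it with one `python3 AnsiballZ_systemd.py` invocation, as the subsequent log lines show.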
10.31.11.217 <<< 11683 1726853260.69876: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853260.69878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.69880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853260.69881: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853260.69883: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853260.69885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853260.69887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853260.69888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853260.69937: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853260.69943: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853260.70038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853260.71712: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11683 1726853260.71727: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11683 1726853260.71738: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11683 1726853260.71776: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension 
"fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853260.71825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853260.71916: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpr04el2s9 /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py <<< 11683 1726853260.71919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py" <<< 11683 1726853260.71979: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpr04el2s9" to remote "/root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py" <<< 11683 1726853260.73643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853260.73673: stderr chunk (state=3): >>><<< 11683 1726853260.73676: stdout chunk (state=3): >>><<< 11683 1726853260.73690: done transferring module to remote 11683 1726853260.73702: _low_level_execute_command(): starting 11683 1726853260.73705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/ /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py && sleep 0' 11683 1726853260.74165: stderr chunk (state=2): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853260.74168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853260.74178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.74181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853260.74183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853260.74185: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.74228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853260.74231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853260.74295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853260.76165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853260.76208: stderr chunk (state=3): >>><<< 11683 1726853260.76215: stdout chunk (state=3): >>><<< 11683 1726853260.76232: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853260.76268: _low_level_execute_command(): starting 11683 1726853260.76275: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/AnsiballZ_systemd.py && sleep 0' 11683 1726853260.76895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853260.76915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853260.76931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853260.76959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853260.77001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853260.77036: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853260.77069: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.77092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853260.77104: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853260.77115: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853260.77127: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853260.77169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853260.77181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853260.77231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853260.77246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853260.77332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853261.07150: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", 
"GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10334208", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3329695744", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "598487000", 
"TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", 
"LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", 
"MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit<<< 11683 1726853261.07165: stdout chunk (state=3): >>>.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", 
"FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11683 1726853261.09150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853261.09176: stderr chunk (state=3): >>><<< 11683 1726853261.09179: stdout chunk (state=3): >>><<< 11683 1726853261.09196: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10334208", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3329695744", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "598487000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
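For readability, the `ansible.legacy.systemd` invocation recorded above (module_args: `name=NetworkManager`, `state=started`, `enabled=true`, plus `_ansible_no_log: True`, which produces the "censored" result further down) corresponds roughly to a task like the following. This is a hedged reconstruction from the logged arguments only; the actual task in the fedora.linux_system_roles.network role may be written differently:

```yaml
# Sketch reconstructed from the logged module_args; not the role's literal
# task file. no_log reflects the '_ansible_no_log': True seen in the log,
# which is why the result is reported as censored.
- name: Enable and start NetworkManager
  ansible.builtin.systemd:
    name: NetworkManager
    state: started
    enabled: true
  no_log: true
```

With `no_log: true`, Ansible still runs the module and records `changed`/`failed`, but replaces the full return payload in console output with the "output has been hidden" message seen in this log.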
11683 1726853261.09315: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853261.09321: _low_level_execute_command(): starting 11683 1726853261.09326: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853260.3957658-12264-49659479640382/ > /dev/null 2>&1 && sleep 0' 11683 1726853261.09780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853261.09783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853261.09785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.09787: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853261.09789: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853261.09791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.09843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853261.09846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853261.09848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853261.09913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853261.11828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853261.11856: stderr chunk (state=3): >>><<< 11683 1726853261.11860: stdout chunk (state=3): >>><<< 11683 1726853261.11875: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853261.11882: handler run complete 11683 1726853261.11920: attempt loop complete, returning result 11683 1726853261.11923: _execute() done 11683 1726853261.11925: dumping result to json 11683 1726853261.11938: done dumping result, returning 11683 1726853261.11946: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-c5b2-e075-000000000032] 11683 1726853261.11954: sending task result for task 02083763-bbaf-c5b2-e075-000000000032 11683 1726853261.12177: done sending task result for task 02083763-bbaf-c5b2-e075-000000000032 11683 1726853261.12180: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853261.12227: no more pending results, returning what we have 11683 1726853261.12230: results queue empty 11683 1726853261.12230: checking for any_errors_fatal 11683 1726853261.12236: done checking for any_errors_fatal 11683 1726853261.12237: checking for max_fail_percentage 11683 1726853261.12238: done checking for max_fail_percentage 11683 1726853261.12239: checking to see if all hosts have failed and the running result is not ok 11683 1726853261.12240: done checking to see if all hosts have failed 11683 1726853261.12241: getting the remaining hosts for this loop 11683 1726853261.12243: done getting the remaining hosts for this loop 11683 1726853261.12246: getting the next task for host managed_node3 11683 1726853261.12251: done getting next task for host managed_node3 11683 1726853261.12255: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11683 1726853261.12257: ^ state is: HOST 
STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853261.12267: getting variables 11683 1726853261.12269: in VariableManager get_vars() 11683 1726853261.12313: Calling all_inventory to load vars for managed_node3 11683 1726853261.12316: Calling groups_inventory to load vars for managed_node3 11683 1726853261.12318: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853261.12328: Calling all_plugins_play to load vars for managed_node3 11683 1726853261.12331: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853261.12333: Calling groups_plugins_play to load vars for managed_node3 11683 1726853261.13126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853261.14093: done with get_vars() 11683 1726853261.14110: done getting variables 11683 1726853261.14157: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:27:41 -0400 (0:00:00.926) 0:00:14.213 ****** 11683 1726853261.14184: 
entering _queue_task() for managed_node3/service 11683 1726853261.14444: worker is 1 (out of 1 available) 11683 1726853261.14456: exiting _queue_task() for managed_node3/service 11683 1726853261.14468: done queuing things up, now waiting for results queue to drain 11683 1726853261.14469: waiting for pending results... 11683 1726853261.14653: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11683 1726853261.14740: in run() - task 02083763-bbaf-c5b2-e075-000000000033 11683 1726853261.14755: variable 'ansible_search_path' from source: unknown 11683 1726853261.14758: variable 'ansible_search_path' from source: unknown 11683 1726853261.14788: calling self._execute() 11683 1726853261.14859: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853261.14863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853261.14874: variable 'omit' from source: magic vars 11683 1726853261.15160: variable 'ansible_distribution_major_version' from source: facts 11683 1726853261.15170: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853261.15251: variable 'network_provider' from source: set_fact 11683 1726853261.15254: Evaluated conditional (network_provider == "nm"): True 11683 1726853261.15318: variable '__network_wpa_supplicant_required' from source: role '' defaults 11683 1726853261.15384: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11683 1726853261.15503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853261.16976: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853261.17023: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853261.17052: Loading FilterModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853261.17079: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853261.17101: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853261.17173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853261.17193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853261.17215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853261.17240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853261.17253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853261.17287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853261.17304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853261.17324: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853261.17350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853261.17361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853261.17390: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853261.17406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853261.17426: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853261.17452: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853261.17463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853261.17567: variable 'network_connections' from source: task vars 11683 1726853261.17580: variable 'controller_profile' from source: play vars 11683 1726853261.17630: variable 'controller_profile' from source: play vars 11683 
1726853261.17643: variable 'controller_device' from source: play vars 11683 1726853261.17689: variable 'controller_device' from source: play vars 11683 1726853261.17697: variable 'port1_profile' from source: play vars 11683 1726853261.17738: variable 'port1_profile' from source: play vars 11683 1726853261.17749: variable 'dhcp_interface1' from source: play vars 11683 1726853261.17792: variable 'dhcp_interface1' from source: play vars 11683 1726853261.17798: variable 'controller_profile' from source: play vars 11683 1726853261.17839: variable 'controller_profile' from source: play vars 11683 1726853261.17845: variable 'port2_profile' from source: play vars 11683 1726853261.17893: variable 'port2_profile' from source: play vars 11683 1726853261.17900: variable 'dhcp_interface2' from source: play vars 11683 1726853261.17940: variable 'dhcp_interface2' from source: play vars 11683 1726853261.17946: variable 'controller_profile' from source: play vars 11683 1726853261.17993: variable 'controller_profile' from source: play vars 11683 1726853261.18042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853261.18159: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853261.18189: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853261.18213: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853261.18235: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853261.18270: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853261.18288: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853261.18307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853261.18328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853261.18369: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853261.18534: variable 'network_connections' from source: task vars 11683 1726853261.18538: variable 'controller_profile' from source: play vars 11683 1726853261.18585: variable 'controller_profile' from source: play vars 11683 1726853261.18591: variable 'controller_device' from source: play vars 11683 1726853261.18634: variable 'controller_device' from source: play vars 11683 1726853261.18641: variable 'port1_profile' from source: play vars 11683 1726853261.18686: variable 'port1_profile' from source: play vars 11683 1726853261.18691: variable 'dhcp_interface1' from source: play vars 11683 1726853261.18732: variable 'dhcp_interface1' from source: play vars 11683 1726853261.18742: variable 'controller_profile' from source: play vars 11683 1726853261.18785: variable 'controller_profile' from source: play vars 11683 1726853261.18791: variable 'port2_profile' from source: play vars 11683 1726853261.18831: variable 'port2_profile' from source: play vars 11683 1726853261.18839: variable 'dhcp_interface2' from source: play vars 11683 1726853261.18885: variable 'dhcp_interface2' from source: play vars 11683 1726853261.18891: variable 'controller_profile' from source: play vars 11683 1726853261.18931: variable 'controller_profile' from source: play vars 11683 1726853261.18967: Evaluated conditional 
(__network_wpa_supplicant_required): False 11683 1726853261.18972: when evaluation is False, skipping this task 11683 1726853261.18975: _execute() done 11683 1726853261.18977: dumping result to json 11683 1726853261.18979: done dumping result, returning 11683 1726853261.18987: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-c5b2-e075-000000000033] 11683 1726853261.18991: sending task result for task 02083763-bbaf-c5b2-e075-000000000033 11683 1726853261.19081: done sending task result for task 02083763-bbaf-c5b2-e075-000000000033 11683 1726853261.19083: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11683 1726853261.19131: no more pending results, returning what we have 11683 1726853261.19134: results queue empty 11683 1726853261.19135: checking for any_errors_fatal 11683 1726853261.19157: done checking for any_errors_fatal 11683 1726853261.19158: checking for max_fail_percentage 11683 1726853261.19159: done checking for max_fail_percentage 11683 1726853261.19160: checking to see if all hosts have failed and the running result is not ok 11683 1726853261.19161: done checking to see if all hosts have failed 11683 1726853261.19162: getting the remaining hosts for this loop 11683 1726853261.19163: done getting the remaining hosts for this loop 11683 1726853261.19167: getting the next task for host managed_node3 11683 1726853261.19182: done getting next task for host managed_node3 11683 1726853261.19187: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11683 1726853261.19189: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853261.19204: getting variables 11683 1726853261.19206: in VariableManager get_vars() 11683 1726853261.19247: Calling all_inventory to load vars for managed_node3 11683 1726853261.19250: Calling groups_inventory to load vars for managed_node3 11683 1726853261.19252: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853261.19261: Calling all_plugins_play to load vars for managed_node3 11683 1726853261.19263: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853261.19266: Calling groups_plugins_play to load vars for managed_node3 11683 1726853261.20055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853261.20938: done with get_vars() 11683 1726853261.20958: done getting variables 11683 1726853261.21003: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:27:41 -0400 (0:00:00.068) 0:00:14.282 ****** 11683 1726853261.21029: entering _queue_task() for managed_node3/service 11683 1726853261.21283: worker is 1 (out of 1 available) 11683 1726853261.21296: exiting _queue_task() for managed_node3/service 
11683 1726853261.21307: done queuing things up, now waiting for results queue to drain 11683 1726853261.21308: waiting for pending results... 11683 1726853261.21486: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11683 1726853261.21579: in run() - task 02083763-bbaf-c5b2-e075-000000000034 11683 1726853261.21591: variable 'ansible_search_path' from source: unknown 11683 1726853261.21594: variable 'ansible_search_path' from source: unknown 11683 1726853261.21623: calling self._execute() 11683 1726853261.21692: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853261.21696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853261.21705: variable 'omit' from source: magic vars 11683 1726853261.21978: variable 'ansible_distribution_major_version' from source: facts 11683 1726853261.21988: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853261.22067: variable 'network_provider' from source: set_fact 11683 1726853261.22073: Evaluated conditional (network_provider == "initscripts"): False 11683 1726853261.22076: when evaluation is False, skipping this task 11683 1726853261.22079: _execute() done 11683 1726853261.22082: dumping result to json 11683 1726853261.22084: done dumping result, returning 11683 1726853261.22097: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-c5b2-e075-000000000034] 11683 1726853261.22101: sending task result for task 02083763-bbaf-c5b2-e075-000000000034 11683 1726853261.22180: done sending task result for task 02083763-bbaf-c5b2-e075-000000000034 11683 1726853261.22183: WORKER PROCESS EXITING skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853261.22237: no more pending results, returning what we have 
11683 1726853261.22241: results queue empty 11683 1726853261.22242: checking for any_errors_fatal 11683 1726853261.22251: done checking for any_errors_fatal 11683 1726853261.22252: checking for max_fail_percentage 11683 1726853261.22254: done checking for max_fail_percentage 11683 1726853261.22255: checking to see if all hosts have failed and the running result is not ok 11683 1726853261.22256: done checking to see if all hosts have failed 11683 1726853261.22256: getting the remaining hosts for this loop 11683 1726853261.22258: done getting the remaining hosts for this loop 11683 1726853261.22261: getting the next task for host managed_node3 11683 1726853261.22267: done getting next task for host managed_node3 11683 1726853261.22272: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11683 1726853261.22275: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853261.22290: getting variables 11683 1726853261.22292: in VariableManager get_vars() 11683 1726853261.22328: Calling all_inventory to load vars for managed_node3 11683 1726853261.22330: Calling groups_inventory to load vars for managed_node3 11683 1726853261.22332: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853261.22340: Calling all_plugins_play to load vars for managed_node3 11683 1726853261.22342: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853261.22344: Calling groups_plugins_play to load vars for managed_node3 11683 1726853261.23220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853261.24064: done with get_vars() 11683 1726853261.24081: done getting variables 11683 1726853261.24125: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:27:41 -0400 (0:00:00.031) 0:00:14.313 ****** 11683 1726853261.24149: entering _queue_task() for managed_node3/copy 11683 1726853261.24387: worker is 1 (out of 1 available) 11683 1726853261.24400: exiting _queue_task() for managed_node3/copy 11683 1726853261.24411: done queuing things up, now waiting for results queue to drain 11683 1726853261.24412: waiting for pending results... 
11683 1726853261.24590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11683 1726853261.24672: in run() - task 02083763-bbaf-c5b2-e075-000000000035 11683 1726853261.24684: variable 'ansible_search_path' from source: unknown 11683 1726853261.24687: variable 'ansible_search_path' from source: unknown 11683 1726853261.24714: calling self._execute() 11683 1726853261.24787: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853261.24791: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853261.24799: variable 'omit' from source: magic vars 11683 1726853261.25077: variable 'ansible_distribution_major_version' from source: facts 11683 1726853261.25089: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853261.25166: variable 'network_provider' from source: set_fact 11683 1726853261.25170: Evaluated conditional (network_provider == "initscripts"): False 11683 1726853261.25175: when evaluation is False, skipping this task 11683 1726853261.25178: _execute() done 11683 1726853261.25180: dumping result to json 11683 1726853261.25185: done dumping result, returning 11683 1726853261.25193: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-c5b2-e075-000000000035] 11683 1726853261.25197: sending task result for task 02083763-bbaf-c5b2-e075-000000000035 11683 1726853261.25285: done sending task result for task 02083763-bbaf-c5b2-e075-000000000035 11683 1726853261.25289: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11683 1726853261.25334: no more pending results, returning what we have 11683 1726853261.25337: results queue empty 11683 1726853261.25338: checking for 
any_errors_fatal 11683 1726853261.25344: done checking for any_errors_fatal 11683 1726853261.25344: checking for max_fail_percentage 11683 1726853261.25346: done checking for max_fail_percentage 11683 1726853261.25347: checking to see if all hosts have failed and the running result is not ok 11683 1726853261.25348: done checking to see if all hosts have failed 11683 1726853261.25349: getting the remaining hosts for this loop 11683 1726853261.25350: done getting the remaining hosts for this loop 11683 1726853261.25353: getting the next task for host managed_node3 11683 1726853261.25359: done getting next task for host managed_node3 11683 1726853261.25362: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11683 1726853261.25365: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853261.25380: getting variables 11683 1726853261.25382: in VariableManager get_vars() 11683 1726853261.25417: Calling all_inventory to load vars for managed_node3 11683 1726853261.25419: Calling groups_inventory to load vars for managed_node3 11683 1726853261.25422: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853261.25430: Calling all_plugins_play to load vars for managed_node3 11683 1726853261.25432: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853261.25434: Calling groups_plugins_play to load vars for managed_node3 11683 1726853261.26177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853261.27041: done with get_vars() 11683 1726853261.27059: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:27:41 -0400 (0:00:00.029) 0:00:14.343 ****** 11683 1726853261.27122: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11683 1726853261.27123: Creating lock for fedora.linux_system_roles.network_connections 11683 1726853261.27369: worker is 1 (out of 1 available) 11683 1726853261.27385: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11683 1726853261.27395: done queuing things up, now waiting for results queue to drain 11683 1726853261.27397: waiting for pending results... 
11683 1726853261.27575: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11683 1726853261.27668: in run() - task 02083763-bbaf-c5b2-e075-000000000036 11683 1726853261.27681: variable 'ansible_search_path' from source: unknown 11683 1726853261.27685: variable 'ansible_search_path' from source: unknown 11683 1726853261.27710: calling self._execute() 11683 1726853261.27779: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853261.27783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853261.27793: variable 'omit' from source: magic vars 11683 1726853261.28065: variable 'ansible_distribution_major_version' from source: facts 11683 1726853261.28080: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853261.28084: variable 'omit' from source: magic vars 11683 1726853261.28121: variable 'omit' from source: magic vars 11683 1726853261.28234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853261.29924: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853261.29970: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853261.29998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853261.30025: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853261.30045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853261.30106: variable 'network_provider' from source: set_fact 11683 1726853261.30204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853261.30225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853261.30245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853261.30276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853261.30287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853261.30339: variable 'omit' from source: magic vars 11683 1726853261.30419: variable 'omit' from source: magic vars 11683 1726853261.30492: variable 'network_connections' from source: task vars 11683 1726853261.30501: variable 'controller_profile' from source: play vars 11683 1726853261.30543: variable 'controller_profile' from source: play vars 11683 1726853261.30552: variable 'controller_device' from source: play vars 11683 1726853261.30598: variable 'controller_device' from source: play vars 11683 1726853261.30605: variable 'port1_profile' from source: play vars 11683 1726853261.30647: variable 'port1_profile' from source: play vars 11683 1726853261.30655: variable 'dhcp_interface1' from source: play vars 11683 1726853261.30701: variable 'dhcp_interface1' from source: play vars 11683 1726853261.30706: variable 'controller_profile' from source: play vars 11683 1726853261.30746: variable 'controller_profile' from source: play vars 11683 1726853261.30755: 
variable 'port2_profile' from source: play vars 11683 1726853261.30812: variable 'port2_profile' from source: play vars 11683 1726853261.30818: variable 'dhcp_interface2' from source: play vars 11683 1726853261.30861: variable 'dhcp_interface2' from source: play vars 11683 1726853261.30866: variable 'controller_profile' from source: play vars 11683 1726853261.30911: variable 'controller_profile' from source: play vars 11683 1726853261.31032: variable 'omit' from source: magic vars 11683 1726853261.31038: variable '__lsr_ansible_managed' from source: task vars 11683 1726853261.31083: variable '__lsr_ansible_managed' from source: task vars 11683 1726853261.31210: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11683 1726853261.31352: Loaded config def from plugin (lookup/template) 11683 1726853261.31355: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11683 1726853261.31377: File lookup term: get_ansible_managed.j2 11683 1726853261.31380: variable 'ansible_search_path' from source: unknown 11683 1726853261.31383: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11683 1726853261.31394: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11683 1726853261.31407: variable 'ansible_search_path' from source: unknown 11683 1726853261.34565: variable 'ansible_managed' from source: unknown 11683 1726853261.34638: variable 'omit' from source: magic vars 11683 1726853261.34661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853261.34687: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853261.34698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853261.34711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853261.34719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853261.34741: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853261.34744: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853261.34750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853261.34816: Set connection var ansible_shell_executable to /bin/sh 11683 1726853261.34824: Set connection var ansible_timeout to 10 11683 1726853261.34830: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853261.34835: Set connection var ansible_pipelining to False 11683 1726853261.34846: Set connection var ansible_shell_type to sh 11683 1726853261.34848: Set connection var ansible_connection to ssh 11683 1726853261.34861: 
variable 'ansible_shell_executable' from source: unknown 11683 1726853261.34864: variable 'ansible_connection' from source: unknown 11683 1726853261.34867: variable 'ansible_module_compression' from source: unknown 11683 1726853261.34870: variable 'ansible_shell_type' from source: unknown 11683 1726853261.34874: variable 'ansible_shell_executable' from source: unknown 11683 1726853261.34876: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853261.34880: variable 'ansible_pipelining' from source: unknown 11683 1726853261.34882: variable 'ansible_timeout' from source: unknown 11683 1726853261.34884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853261.34975: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853261.34982: variable 'omit' from source: magic vars 11683 1726853261.34989: starting attempt loop 11683 1726853261.34991: running the handler 11683 1726853261.35003: _low_level_execute_command(): starting 11683 1726853261.35010: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853261.35520: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853261.35524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.35526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853261.35528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853261.35530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.35578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853261.35597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853261.35601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853261.35662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853261.37408: stdout chunk (state=3): >>>/root <<< 11683 1726853261.37504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853261.37535: stderr chunk (state=3): >>><<< 11683 1726853261.37538: stdout chunk (state=3): >>><<< 11683 1726853261.37559: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853261.37570: _low_level_execute_command(): starting 11683 1726853261.37576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933 `" && echo ansible-tmp-1726853261.3755953-12295-119147697663933="` echo /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933 `" ) && sleep 0' 11683 1726853261.38013: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853261.38017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853261.38019: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.38021: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853261.38023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.38077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853261.38082: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853261.38139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853261.40116: stdout chunk (state=3): >>>ansible-tmp-1726853261.3755953-12295-119147697663933=/root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933 <<< 11683 1726853261.40226: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853261.40250: stderr chunk (state=3): >>><<< 11683 1726853261.40255: stdout chunk (state=3): >>><<< 11683 1726853261.40266: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853261.3755953-12295-119147697663933=/root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853261.40305: variable 'ansible_module_compression' from source: unknown 11683 1726853261.40347: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11683 1726853261.40351: ANSIBALLZ: Acquiring lock 11683 1726853261.40353: ANSIBALLZ: Lock acquired: 139785059514032 11683 1726853261.40355: ANSIBALLZ: Creating module 11683 1726853261.53330: ANSIBALLZ: Writing module into payload 11683 1726853261.53553: ANSIBALLZ: Writing module 11683 1726853261.53574: ANSIBALLZ: Renaming module 11683 1726853261.53578: ANSIBALLZ: Done creating module 11683 1726853261.53600: variable 'ansible_facts' from source: unknown 11683 1726853261.53667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py 11683 1726853261.53766: Sending initial data 11683 1726853261.53769: Sent initial data (168 bytes) 11683 1726853261.54234: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853261.54240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.54242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853261.54247: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853261.54249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.54299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853261.54302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853261.54304: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853261.54378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853261.56055: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11683 1726853261.56059: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853261.56112: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853261.56170: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpbki93pc5 /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py <<< 11683 1726853261.56178: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py" <<< 11683 1726853261.56229: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpbki93pc5" to remote "/root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py" <<< 11683 1726853261.56232: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py" <<< 11683 1726853261.57027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853261.57075: stderr chunk (state=3): >>><<< 11683 1726853261.57078: stdout chunk (state=3): >>><<< 11683 1726853261.57115: done transferring module to remote 11683 1726853261.57125: _low_level_execute_command(): starting 11683 1726853261.57130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/ /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py && sleep 0' 11683 1726853261.57585: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853261.57588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853261.57592: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.57595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853261.57597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.57653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853261.57658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853261.57660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853261.57718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853261.59564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853261.59591: stderr chunk (state=3): >>><<< 11683 1726853261.59594: stdout chunk (state=3): >>><<< 11683 1726853261.59608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853261.59616: _low_level_execute_command(): starting 11683 1726853261.59618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/AnsiballZ_network_connections.py && sleep 0' 11683 1726853261.60080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853261.60083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853261.60085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.60087: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853261.60089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found <<< 11683 1726853261.60091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853261.60141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853261.60144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853261.60149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853261.60217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.02933: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, 
"__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11683 1726853262.05094: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853262.05123: stderr chunk (state=3): >>><<< 11683 1726853262.05126: stdout chunk (state=3): >>><<< 11683 1726853262.05142: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", 
"interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853262.05188: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853262.05197: _low_level_execute_command(): starting 11683 1726853262.05201: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853261.3755953-12295-119147697663933/ > /dev/null 2>&1 && sleep 0' 11683 1726853262.05660: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853262.05664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853262.05667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853262.05669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.05727: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853262.05732: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853262.05734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.05798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.07772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.07801: stderr chunk (state=3): >>><<< 11683 1726853262.07804: stdout chunk (state=3): >>><<< 11683 1726853262.07822: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853262.07828: handler run complete 11683 1726853262.07854: attempt loop complete, returning result 11683 1726853262.07857: _execute() done 11683 1726853262.07860: dumping result to json 11683 1726853262.07866: done dumping result, returning 11683 1726853262.07876: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-c5b2-e075-000000000036] 11683 1726853262.07880: sending task result for task 02083763-bbaf-c5b2-e075-000000000036 11683 1726853262.07995: done sending task result for task 02083763-bbaf-c5b2-e075-000000000036 11683 1726853262.07997: WORKER PROCESS EXITING changed: [managed_node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] 
#0, state:up persistent_state:present, 'bond0': add connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 (not-active) 11683 1726853262.08132: no more pending results, returning what we have 11683 1726853262.08135: results queue empty 11683 1726853262.08136: checking for any_errors_fatal 11683 1726853262.08141: done checking for any_errors_fatal 11683 1726853262.08142: checking for max_fail_percentage 11683 1726853262.08147: done checking for max_fail_percentage 11683 1726853262.08148: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.08149: done checking to see if all hosts have failed 11683 1726853262.08149: getting the remaining hosts for this loop 11683 1726853262.08151: done getting the remaining hosts for this loop 11683 1726853262.08154: getting the next task for host managed_node3 11683 1726853262.08159: done getting next task for host managed_node3 11683 1726853262.08162: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11683 1726853262.08165: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853262.08193: getting variables 11683 1726853262.08194: in VariableManager get_vars() 11683 1726853262.08239: Calling all_inventory to load vars for managed_node3 11683 1726853262.08242: Calling groups_inventory to load vars for managed_node3 11683 1726853262.08246: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.08255: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.08257: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.08259: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.09823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.11520: done with get_vars() 11683 1726853262.11543: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 13:27:42 -0400 (0:00:00.845) 0:00:15.188 ****** 11683 1726853262.11640: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11683 1726853262.11642: Creating lock for fedora.linux_system_roles.network_state 11683 1726853262.12136: worker is 1 (out of 1 available) 11683 1726853262.12149: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11683 1726853262.12160: done queuing things up, now waiting for results queue to drain 11683 1726853262.12161: waiting for pending results... 
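The `_low_level_execute_command()` sequence recorded above follows a fixed cycle: sftp `put` of the AnsiballZ payload into a per-task temp dir, `chmod u+x` on the dir and payload, execution with the target interpreter (`/usr/bin/python3.12 ... && sleep 0`), JSON result on stdout, then `rm -f -r` of the temp dir. A minimal local sketch of that cycle, with illustrative stand-in paths rather than the real AnsiballZ_network_connections.py payload, and assuming `python3` is on PATH:

```shell
# Sketch of the transfer / chmod / execute / cleanup cycle from the log.
# Paths and payload are stand-ins, not the real AnsiballZ wrapper.
set -eu

tmpdir=$(mktemp -d)                      # stands in for ~/.ansible/tmp/ansible-tmp-<...>/
payload="$tmpdir/AnsiballZ_example.py"   # stands in for the transferred module wrapper

# 1. "put" the module payload (a local copy here; the log shows an sftp upload)
printf '%s\n' 'import json; print(json.dumps({"changed": True}))' > "$payload"

# 2. chmod u+x on both the temp dir and the payload, as _low_level_execute_command() does
chmod u+x "$tmpdir" "$payload"

# 3. run the module with the target interpreter; the JSON result comes back on stdout
out=$(python3 "$payload")
echo "$out"

# 4. delete the temp dir, mirroring the final 'rm -f -r ... > /dev/null 2>&1' step
rm -rf "$tmpdir"
```

The echoed line is the same kind of one-line JSON document that appears in the log's `stdout chunk` records, which the controller then parses into the task result.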
11683 1726853262.12580: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11683 1726853262.12585: in run() - task 02083763-bbaf-c5b2-e075-000000000037 11683 1726853262.12587: variable 'ansible_search_path' from source: unknown 11683 1726853262.12590: variable 'ansible_search_path' from source: unknown 11683 1726853262.12598: calling self._execute() 11683 1726853262.12706: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.12719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.12735: variable 'omit' from source: magic vars 11683 1726853262.13023: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.13033: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.13118: variable 'network_state' from source: role '' defaults 11683 1726853262.13126: Evaluated conditional (network_state != {}): False 11683 1726853262.13129: when evaluation is False, skipping this task 11683 1726853262.13131: _execute() done 11683 1726853262.13134: dumping result to json 11683 1726853262.13137: done dumping result, returning 11683 1726853262.13144: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-c5b2-e075-000000000037] 11683 1726853262.13151: sending task result for task 02083763-bbaf-c5b2-e075-000000000037 11683 1726853262.13227: done sending task result for task 02083763-bbaf-c5b2-e075-000000000037 11683 1726853262.13230: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853262.13278: no more pending results, returning what we have 11683 1726853262.13282: results queue empty 11683 1726853262.13283: checking for any_errors_fatal 11683 1726853262.13294: done checking for any_errors_fatal 
11683 1726853262.13295: checking for max_fail_percentage 11683 1726853262.13297: done checking for max_fail_percentage 11683 1726853262.13298: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.13299: done checking to see if all hosts have failed 11683 1726853262.13299: getting the remaining hosts for this loop 11683 1726853262.13301: done getting the remaining hosts for this loop 11683 1726853262.13304: getting the next task for host managed_node3 11683 1726853262.13309: done getting next task for host managed_node3 11683 1726853262.13313: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11683 1726853262.13315: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853262.13328: getting variables 11683 1726853262.13330: in VariableManager get_vars() 11683 1726853262.13366: Calling all_inventory to load vars for managed_node3 11683 1726853262.13368: Calling groups_inventory to load vars for managed_node3 11683 1726853262.13370: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.13380: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.13382: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.13384: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.14112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.14962: done with get_vars() 11683 1726853262.14978: done getting variables 11683 1726853262.15017: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 13:27:42 -0400 (0:00:00.034) 0:00:15.222 ****** 11683 1726853262.15041: entering _queue_task() for managed_node3/debug 11683 1726853262.15244: worker is 1 (out of 1 available) 11683 1726853262.15256: exiting _queue_task() for managed_node3/debug 11683 1726853262.15268: done queuing things up, now waiting for results queue to drain 11683 1726853262.15269: waiting for pending results... 
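The `module_args` dumped above for the "Configure networking connection profiles" task map back to role input of roughly this shape. This is a hedged reconstruction, assuming the role's documented `network_provider` and `network_connections` variables; connection names, interface names (`nm-bond`, `test1`, `test2`), and values are copied verbatim from the logged invocation:

```yaml
network_provider: nm
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0
```

With this input, `network_state` stays at its empty default, which is why the subsequent "Configure networking state" task evaluates `network_state != {}` as False and is skipped.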
11683 1726853262.15439: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11683 1726853262.15517: in run() - task 02083763-bbaf-c5b2-e075-000000000038 11683 1726853262.15529: variable 'ansible_search_path' from source: unknown 11683 1726853262.15533: variable 'ansible_search_path' from source: unknown 11683 1726853262.15561: calling self._execute() 11683 1726853262.15628: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.15632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.15641: variable 'omit' from source: magic vars 11683 1726853262.15900: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.15909: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.15915: variable 'omit' from source: magic vars 11683 1726853262.15960: variable 'omit' from source: magic vars 11683 1726853262.15985: variable 'omit' from source: magic vars 11683 1726853262.16014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853262.16044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853262.16059: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853262.16074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.16084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.16106: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853262.16109: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.16111: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 11683 1726853262.16183: Set connection var ansible_shell_executable to /bin/sh 11683 1726853262.16191: Set connection var ansible_timeout to 10 11683 1726853262.16198: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853262.16202: Set connection var ansible_pipelining to False 11683 1726853262.16205: Set connection var ansible_shell_type to sh 11683 1726853262.16207: Set connection var ansible_connection to ssh 11683 1726853262.16224: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.16227: variable 'ansible_connection' from source: unknown 11683 1726853262.16230: variable 'ansible_module_compression' from source: unknown 11683 1726853262.16232: variable 'ansible_shell_type' from source: unknown 11683 1726853262.16234: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.16236: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.16239: variable 'ansible_pipelining' from source: unknown 11683 1726853262.16243: variable 'ansible_timeout' from source: unknown 11683 1726853262.16249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.16346: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853262.16357: variable 'omit' from source: magic vars 11683 1726853262.16362: starting attempt loop 11683 1726853262.16364: running the handler 11683 1726853262.16453: variable '__network_connections_result' from source: set_fact 11683 1726853262.16503: handler run complete 11683 1726853262.16515: attempt loop complete, returning result 11683 1726853262.16518: _execute() done 11683 1726853262.16521: dumping result to json 11683 1726853262.16523: 
done dumping result, returning 11683 1726853262.16532: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-c5b2-e075-000000000038] 11683 1726853262.16538: sending task result for task 02083763-bbaf-c5b2-e075-000000000038 11683 1726853262.16619: done sending task result for task 02083763-bbaf-c5b2-e075-000000000038 11683 1726853262.16623: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 (not-active)" ] } 11683 1726853262.16684: no more pending results, returning what we have 11683 1726853262.16687: results queue empty 11683 1726853262.16688: checking for any_errors_fatal 11683 1726853262.16692: done checking for any_errors_fatal 11683 1726853262.16693: checking for max_fail_percentage 11683 1726853262.16695: done checking for max_fail_percentage 11683 1726853262.16695: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.16696: done checking to see if all hosts have failed 11683 1726853262.16697: getting the remaining hosts for this loop 11683 1726853262.16698: done getting the remaining hosts for this loop 11683 1726853262.16701: getting the next task for host 
managed_node3 11683 1726853262.16706: done getting next task for host managed_node3 11683 1726853262.16709: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11683 1726853262.16712: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853262.16721: getting variables 11683 1726853262.16722: in VariableManager get_vars() 11683 1726853262.16752: Calling all_inventory to load vars for managed_node3 11683 1726853262.16755: Calling groups_inventory to load vars for managed_node3 11683 1726853262.16757: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.16765: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.16767: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.16769: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.17573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.18421: done with get_vars() 11683 1726853262.18435: done getting variables 11683 1726853262.18479: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:27:42 -0400 (0:00:00.034) 0:00:15.257 ****** 11683 1726853262.18505: entering _queue_task() for managed_node3/debug 11683 1726853262.18706: worker is 1 (out of 1 available) 11683 1726853262.18717: exiting _queue_task() for managed_node3/debug 11683 1726853262.18728: done queuing things up, now waiting for results queue to drain 11683 1726853262.18729: waiting for pending results... 11683 1726853262.18904: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11683 1726853262.18985: in run() - task 02083763-bbaf-c5b2-e075-000000000039 11683 1726853262.18996: variable 'ansible_search_path' from source: unknown 11683 1726853262.19001: variable 'ansible_search_path' from source: unknown 11683 1726853262.19026: calling self._execute() 11683 1726853262.19091: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.19096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.19104: variable 'omit' from source: magic vars 11683 1726853262.19361: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.19369: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.19377: variable 'omit' from source: magic vars 11683 1726853262.19415: variable 'omit' from source: magic vars 11683 1726853262.19440: variable 'omit' from source: magic vars 11683 1726853262.19473: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853262.19500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853262.19517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
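The two debug tasks traced here (task paths main.yml:181 and main.yml:186 inside the role) simply print the registered result of the network_connections module run. A hedged reconstruction of what such tasks plausibly look like in the role source — task names and variable names are taken from the log, but the exact YAML is an assumption, not the actual role file:

```yaml
# Sketch only; reconstructed from the log, not copied from the role source.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
```

This matches the log's behavior: the first task emits only `stderr_lines`, while the second dumps the full result object including `_invocation.module_args`.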
11683 1726853262.19530: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.19539: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.19565: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853262.19567: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.19570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.19638: Set connection var ansible_shell_executable to /bin/sh 11683 1726853262.19646: Set connection var ansible_timeout to 10 11683 1726853262.19654: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853262.19659: Set connection var ansible_pipelining to False 11683 1726853262.19661: Set connection var ansible_shell_type to sh 11683 1726853262.19663: Set connection var ansible_connection to ssh 11683 1726853262.19681: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.19684: variable 'ansible_connection' from source: unknown 11683 1726853262.19687: variable 'ansible_module_compression' from source: unknown 11683 1726853262.19689: variable 'ansible_shell_type' from source: unknown 11683 1726853262.19691: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.19693: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.19696: variable 'ansible_pipelining' from source: unknown 11683 1726853262.19700: variable 'ansible_timeout' from source: unknown 11683 1726853262.19703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.19801: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853262.19809: variable 'omit' from source: magic vars 11683 1726853262.19814: starting attempt loop 11683 1726853262.19817: running the handler 11683 1726853262.19860: variable '__network_connections_result' from source: set_fact 11683 1726853262.19913: variable '__network_connections_result' from source: set_fact 11683 1726853262.20026: handler run complete 11683 1726853262.20047: attempt loop complete, returning result 11683 1726853262.20050: _execute() done 11683 1726853262.20052: dumping result to json 11683 1726853262.20062: done dumping result, returning 11683 1726853262.20065: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-c5b2-e075-000000000039] 11683 1726853262.20073: sending task result for task 02083763-bbaf-c5b2-e075-000000000039 11683 1726853262.20157: done sending task result for task 02083763-bbaf-c5b2-e075-000000000039 11683 1726853262.20160: WORKER PROCESS EXITING ok: [managed_node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 
3da92b17-45a3-429c-9e42-16f5e5b46354\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 3da92b17-45a3-429c-9e42-16f5e5b46354 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 014f8cbf-bba4-4157-aa64-400d4d1c3b6d (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 3f4378cb-8ba1-4df0-ad5d-f4be4454b744 (not-active)" ] } } 11683 1726853262.20256: no more pending results, returning what we have 11683 1726853262.20259: results queue empty 11683 1726853262.20265: checking for any_errors_fatal 11683 1726853262.20272: done checking for any_errors_fatal 11683 1726853262.20273: checking for max_fail_percentage 11683 1726853262.20274: done checking for max_fail_percentage 11683 1726853262.20275: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.20276: done checking to see if all hosts have failed 11683 1726853262.20276: getting the remaining 
hosts for this loop 11683 1726853262.20277: done getting the remaining hosts for this loop 11683 1726853262.20280: getting the next task for host managed_node3 11683 1726853262.20285: done getting next task for host managed_node3 11683 1726853262.20288: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11683 1726853262.20291: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853262.20299: getting variables 11683 1726853262.20301: in VariableManager get_vars() 11683 1726853262.20331: Calling all_inventory to load vars for managed_node3 11683 1726853262.20334: Calling groups_inventory to load vars for managed_node3 11683 1726853262.20336: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.20343: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.20345: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.20348: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.21051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.21980: done with get_vars() 11683 1726853262.21994: done getting variables 11683 1726853262.22031: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 13:27:42 -0400 (0:00:00.035) 0:00:15.292 ****** 11683 1726853262.22053: entering _queue_task() for managed_node3/debug 11683 1726853262.22259: worker is 1 (out of 1 available) 11683 1726853262.22274: exiting _queue_task() for managed_node3/debug 11683 1726853262.22284: done queuing things up, now waiting for results queue to drain 11683 1726853262.22285: waiting for pending results... 11683 1726853262.22456: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11683 1726853262.22540: in run() - task 02083763-bbaf-c5b2-e075-00000000003a 11683 1726853262.22554: variable 'ansible_search_path' from source: unknown 11683 1726853262.22558: variable 'ansible_search_path' from source: unknown 11683 1726853262.22587: calling self._execute() 11683 1726853262.22656: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.22661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.22669: variable 'omit' from source: magic vars 11683 1726853262.22927: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.22936: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.23024: variable 'network_state' from source: role '' defaults 11683 1726853262.23032: Evaluated conditional (network_state != {}): False 11683 1726853262.23035: when evaluation is False, skipping this task 11683 1726853262.23038: _execute() done 11683 1726853262.23041: dumping result to json 11683 1726853262.23043: done 
dumping result, returning 11683 1726853262.23054: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-c5b2-e075-00000000003a] 11683 1726853262.23059: sending task result for task 02083763-bbaf-c5b2-e075-00000000003a 11683 1726853262.23137: done sending task result for task 02083763-bbaf-c5b2-e075-00000000003a 11683 1726853262.23140: WORKER PROCESS EXITING skipping: [managed_node3] => { "false_condition": "network_state != {}" } 11683 1726853262.23201: no more pending results, returning what we have 11683 1726853262.23205: results queue empty 11683 1726853262.23206: checking for any_errors_fatal 11683 1726853262.23213: done checking for any_errors_fatal 11683 1726853262.23214: checking for max_fail_percentage 11683 1726853262.23215: done checking for max_fail_percentage 11683 1726853262.23216: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.23217: done checking to see if all hosts have failed 11683 1726853262.23217: getting the remaining hosts for this loop 11683 1726853262.23219: done getting the remaining hosts for this loop 11683 1726853262.23222: getting the next task for host managed_node3 11683 1726853262.23227: done getting next task for host managed_node3 11683 1726853262.23231: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11683 1726853262.23233: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11683 1726853262.23246: getting variables 11683 1726853262.23247: in VariableManager get_vars() 11683 1726853262.23280: Calling all_inventory to load vars for managed_node3 11683 1726853262.23283: Calling groups_inventory to load vars for managed_node3 11683 1726853262.23285: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.23293: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.23295: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.23297: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.24010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.24865: done with get_vars() 11683 1726853262.24882: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 13:27:42 -0400 (0:00:00.028) 0:00:15.321 ****** 11683 1726853262.24950: entering _queue_task() for managed_node3/ping 11683 1726853262.24952: Creating lock for ping 11683 1726853262.25187: worker is 1 (out of 1 available) 11683 1726853262.25200: exiting _queue_task() for managed_node3/ping 11683 1726853262.25211: done queuing things up, now waiting for results queue to drain 11683 1726853262.25212: waiting for pending results... 
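The `module_args` dumped in the result above correspond to a role invocation along these lines. This is a hedged sketch assembled from the logged parameters (connection names, bond options, and interface names are verbatim from the log; the surrounding play structure and `include_role` wrapper are assumptions):

```yaml
# Sketch only; parameter values taken from the logged module_args.
- name: Configure an active-backup bond with two ports
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.network
  vars:
    network_connections:
      - name: bond0
        type: bond
        interface_name: nm-bond
        state: up
        bond:
          mode: active-backup
          miimon: 110
        ip:
          route_metric4: 65535
      - name: bond0.0
        type: ethernet
        interface_name: test1
        controller: bond0
        state: up
      - name: bond0.1
        type: ethernet
        interface_name: test2
        controller: bond0
        state: up
```

With `provider: nm` (as logged), the role translates each entry into a NetworkManager connection profile, which is why the stderr trace shows `add connection` followed by `up connection` for each of the three profiles.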
11683 1726853262.25390: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11683 1726853262.25478: in run() - task 02083763-bbaf-c5b2-e075-00000000003b 11683 1726853262.25490: variable 'ansible_search_path' from source: unknown 11683 1726853262.25494: variable 'ansible_search_path' from source: unknown 11683 1726853262.25520: calling self._execute() 11683 1726853262.25589: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.25593: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.25602: variable 'omit' from source: magic vars 11683 1726853262.25864: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.25880: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.25883: variable 'omit' from source: magic vars 11683 1726853262.25921: variable 'omit' from source: magic vars 11683 1726853262.25944: variable 'omit' from source: magic vars 11683 1726853262.25980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853262.26007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853262.26023: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853262.26036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.26046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.26073: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853262.26076: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.26079: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11683 1726853262.26146: Set connection var ansible_shell_executable to /bin/sh 11683 1726853262.26156: Set connection var ansible_timeout to 10 11683 1726853262.26162: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853262.26167: Set connection var ansible_pipelining to False 11683 1726853262.26170: Set connection var ansible_shell_type to sh 11683 1726853262.26174: Set connection var ansible_connection to ssh 11683 1726853262.26190: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.26193: variable 'ansible_connection' from source: unknown 11683 1726853262.26196: variable 'ansible_module_compression' from source: unknown 11683 1726853262.26200: variable 'ansible_shell_type' from source: unknown 11683 1726853262.26202: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.26204: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.26206: variable 'ansible_pipelining' from source: unknown 11683 1726853262.26209: variable 'ansible_timeout' from source: unknown 11683 1726853262.26218: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.26359: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853262.26367: variable 'omit' from source: magic vars 11683 1726853262.26372: starting attempt loop 11683 1726853262.26376: running the handler 11683 1726853262.26387: _low_level_execute_command(): starting 11683 1726853262.26393: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853262.26905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 
1726853262.26909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11683 1726853262.26913: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.26960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853262.26963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.27041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.28754: stdout chunk (state=3): >>>/root <<< 11683 1726853262.28855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.28886: stderr chunk (state=3): >>><<< 11683 1726853262.28889: stdout chunk (state=3): >>><<< 11683 1726853262.28909: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853262.28920: _low_level_execute_command(): starting 11683 1726853262.28927: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985 `" && echo ansible-tmp-1726853262.2890964-12322-168128047284985="` echo /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985 `" ) && sleep 0' 11683 1726853262.29407: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853262.29466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853262.29481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.29605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853262.29609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.29696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.31687: stdout chunk (state=3): >>>ansible-tmp-1726853262.2890964-12322-168128047284985=/root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985 <<< 11683 1726853262.31839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.31843: stdout chunk (state=3): >>><<< 11683 1726853262.31848: stderr chunk (state=3): >>><<< 11683 1726853262.31964: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853262.2890964-12322-168128047284985=/root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853262.31968: variable 'ansible_module_compression' from source: unknown 11683 1726853262.31978: ANSIBALLZ: Using lock for ping 11683 1726853262.31986: ANSIBALLZ: Acquiring lock 11683 1726853262.31993: ANSIBALLZ: Lock acquired: 139785059121536 11683 1726853262.32000: ANSIBALLZ: Creating module 11683 1726853262.42763: ANSIBALLZ: Writing module into payload 11683 1726853262.42831: ANSIBALLZ: Writing module 11683 1726853262.42857: ANSIBALLZ: Renaming module 11683 1726853262.42866: ANSIBALLZ: Done creating module 11683 1726853262.42889: variable 'ansible_facts' from source: unknown 11683 1726853262.42955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py 11683 1726853262.43191: Sending initial data 11683 1726853262.43194: Sent initial data (153 bytes) 11683 1726853262.43687: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853262.43700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853262.43756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.43851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853262.43854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.43911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.45638: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853262.45691: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853262.45768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpf105yata /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py <<< 11683 1726853262.45772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py" <<< 11683 1726853262.45820: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpf105yata" to remote "/root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py" <<< 11683 1726853262.46646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.46779: stderr chunk (state=3): >>><<< 11683 1726853262.46783: stdout chunk (state=3): >>><<< 11683 1726853262.46785: done transferring module to remote 11683 1726853262.46788: _low_level_execute_command(): starting 11683 1726853262.46790: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/ /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py && sleep 0' 11683 1726853262.47455: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853262.47473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853262.47554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.47602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853262.47632: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853262.47673: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.47759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.49791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.49798: stderr chunk (state=3): >>><<< 11683 1726853262.49808: stdout chunk (state=3): >>><<< 11683 1726853262.49878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853262.49881: _low_level_execute_command(): starting 11683 1726853262.49884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/AnsiballZ_ping.py && sleep 0' 11683 1726853262.51032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853262.51041: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853262.51052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853262.51067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853262.51081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853262.51090: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853262.51100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.51114: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853262.51189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853262.51192: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853262.51194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853262.51196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853262.51198: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853262.51200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853262.51202: stderr chunk (state=3): >>>debug2: match found <<< 11683 1726853262.51204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.51282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853262.51295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853262.51426: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.51554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.67378: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11683 1726853262.68566: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.68720: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853262.68724: stdout chunk (state=3): >>><<< 11683 1726853262.68730: stderr chunk (state=3): >>><<< 11683 1726853262.68750: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
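The stdout recovered above is the complete JSON result of the `ping` module. A minimal sketch of parsing that payload the way a controller-side consumer would, using the exact string from this log:

```python
import json

# The exact stdout chunk the ping module returned in this run:
raw = '{"ping": "pong", "invocation": {"module_args": {"data": "pong"}}}'
result = json.loads(raw)

print(result["ping"])                                  # pong
print(result["invocation"]["module_args"]["data"])     # pong
```

The `invocation` key echoes the module arguments back, which is how the final task result can report `"ping": "pong"` with `"changed": false`.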
11683 1726853262.68770: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853262.68783: _low_level_execute_command(): starting 11683 1726853262.68788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853262.2890964-12322-168128047284985/ > /dev/null 2>&1 && sleep 0' 11683 1726853262.70081: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853262.70188: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853262.70202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853262.70231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853262.70370: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853262.70605: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.70666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.72625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.72705: stderr chunk (state=3): >>><<< 11683 1726853262.72787: stdout chunk (state=3): >>><<< 11683 1726853262.72815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853262.72827: handler run complete 
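Every remote step in this log is wrapped the same way: `/bin/sh -c '<command> && sleep 0'`, executed over the multiplexed SSH connection, with success judged by the return code (`rc=0`). A hedged local sketch of that pattern, reusing the shape of the cleanup command above (the temp-dir path here is an illustrative stand-in, not the one from this run):

```python
import subprocess
import tempfile

# Illustrative stand-in for /root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<rand>/
tmpdir = tempfile.mkdtemp(prefix="ansible-tmp-")

# Same shape as the cleanup step in the log: rm -f -r <tmpdir> ... && sleep 0
cmd = f"rm -f -r {tmpdir} > /dev/null 2>&1 && sleep 0"
proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
print(f"rc={proc.returncode}")  # mirrors "_low_level_execute_command() done: rc=0"
```

The trailing `sleep 0` is a no-op that Ansible appends to every wrapped command; only the combined return code matters.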
11683 1726853262.72852: attempt loop complete, returning result 11683 1726853262.72883: _execute() done 11683 1726853262.72891: dumping result to json 11683 1726853262.72917: done dumping result, returning 11683 1726853262.72932: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-c5b2-e075-00000000003b] 11683 1726853262.73180: sending task result for task 02083763-bbaf-c5b2-e075-00000000003b 11683 1726853262.73247: done sending task result for task 02083763-bbaf-c5b2-e075-00000000003b 11683 1726853262.73251: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "ping": "pong" } 11683 1726853262.73341: no more pending results, returning what we have 11683 1726853262.73347: results queue empty 11683 1726853262.73348: checking for any_errors_fatal 11683 1726853262.73355: done checking for any_errors_fatal 11683 1726853262.73356: checking for max_fail_percentage 11683 1726853262.73358: done checking for max_fail_percentage 11683 1726853262.73358: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.73360: done checking to see if all hosts have failed 11683 1726853262.73360: getting the remaining hosts for this loop 11683 1726853262.73362: done getting the remaining hosts for this loop 11683 1726853262.73366: getting the next task for host managed_node3 11683 1726853262.73378: done getting next task for host managed_node3 11683 1726853262.73380: ^ task is: TASK: meta (role_complete) 11683 1726853262.73383: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853262.73395: getting variables 11683 1726853262.73397: in VariableManager get_vars() 11683 1726853262.73443: Calling all_inventory to load vars for managed_node3 11683 1726853262.73448: Calling groups_inventory to load vars for managed_node3 11683 1726853262.73451: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.73463: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.73466: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.73470: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.76952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.79919: done with get_vars() 11683 1726853262.79956: done getting variables 11683 1726853262.80050: done queuing things up, now waiting for results queue to drain 11683 1726853262.80052: results queue empty 11683 1726853262.80053: checking for any_errors_fatal 11683 1726853262.80056: done checking for any_errors_fatal 11683 1726853262.80057: checking for max_fail_percentage 11683 1726853262.80058: done checking for max_fail_percentage 11683 1726853262.80059: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.80060: done checking to see if all hosts have failed 11683 1726853262.80060: getting the remaining hosts for this loop 11683 1726853262.80061: done getting the remaining hosts for this loop 11683 1726853262.80064: getting the next task for host managed_node3 11683 1726853262.80069: done getting next task for host managed_node3 11683 1726853262.80074: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11683 1726853262.80076: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853262.80078: getting variables 11683 1726853262.80079: in VariableManager get_vars() 11683 1726853262.80096: Calling all_inventory to load vars for managed_node3 11683 1726853262.80100: Calling groups_inventory to load vars for managed_node3 11683 1726853262.80102: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.80107: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.80110: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.80112: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.81311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.82910: done with get_vars() 11683 1726853262.82939: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 13:27:42 -0400 (0:00:00.580) 0:00:15.902 ****** 11683 1726853262.83035: entering _queue_task() for managed_node3/include_tasks 11683 1726853262.83410: worker is 1 (out of 1 available) 11683 1726853262.83423: exiting _queue_task() for managed_node3/include_tasks 11683 1726853262.83435: done queuing things up, now waiting for results queue to drain 11683 1726853262.83437: waiting for pending results... 
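The float prefixed to every log entry (e.g. `1726853262.83035`) is a Unix timestamp, which is how the task banner above can show `Friday 20 September 2024 13:27:42 -0400`. A quick check of that correspondence:

```python
from datetime import datetime, timezone, timedelta

ts = 1726853262.83035                      # prefix taken from an entry above
eastern = timezone(timedelta(hours=-4))    # the -0400 offset shown in the banner
stamp = datetime.fromtimestamp(ts, eastern).strftime("%A %d %B %Y %H:%M:%S %z")
print(stamp)  # Friday 20 September 2024 13:27:42 -0400
```

The two durations in the banner, `(0:00:00.580)` and `0:00:15.902`, are the elapsed time of the previous task and the cumulative playbook runtime, both derived from these same timestamps.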
11683 1726853262.83733: running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' 11683 1726853262.83873: in run() - task 02083763-bbaf-c5b2-e075-00000000006e 11683 1726853262.83903: variable 'ansible_search_path' from source: unknown 11683 1726853262.83912: variable 'ansible_search_path' from source: unknown 11683 1726853262.83954: calling self._execute() 11683 1726853262.84076: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.84080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.84083: variable 'omit' from source: magic vars 11683 1726853262.84477: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.84495: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.84553: _execute() done 11683 1726853262.84557: dumping result to json 11683 1726853262.84560: done dumping result, returning 11683 1726853262.84563: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_interface_stat.yml' [02083763-bbaf-c5b2-e075-00000000006e] 11683 1726853262.84565: sending task result for task 02083763-bbaf-c5b2-e075-00000000006e 11683 1726853262.84687: no more pending results, returning what we have 11683 1726853262.84692: in VariableManager get_vars() 11683 1726853262.84742: Calling all_inventory to load vars for managed_node3 11683 1726853262.84748: Calling groups_inventory to load vars for managed_node3 11683 1726853262.84750: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.84766: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.84769: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.84774: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.85487: done sending task result for task 02083763-bbaf-c5b2-e075-00000000006e 11683 1726853262.85490: WORKER PROCESS EXITING 11683 
1726853262.86539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.88128: done with get_vars() 11683 1726853262.88157: variable 'ansible_search_path' from source: unknown 11683 1726853262.88158: variable 'ansible_search_path' from source: unknown 11683 1726853262.88199: we have included files to process 11683 1726853262.88200: generating all_blocks data 11683 1726853262.88202: done generating all_blocks data 11683 1726853262.88207: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853262.88209: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853262.88211: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11683 1726853262.88413: done processing included file 11683 1726853262.88416: iterating over new_blocks loaded from include file 11683 1726853262.88417: in VariableManager get_vars() 11683 1726853262.88437: done with get_vars() 11683 1726853262.88439: filtering new block on tags 11683 1726853262.88463: done filtering new block on tags 11683 1726853262.88466: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node3 11683 1726853262.88473: extending task lists for all hosts with included blocks 11683 1726853262.88583: done extending task lists 11683 1726853262.88584: done processing included files 11683 1726853262.88585: results queue empty 11683 1726853262.88586: checking for any_errors_fatal 11683 1726853262.88587: done checking for any_errors_fatal 11683 1726853262.88588: checking for max_fail_percentage 11683 1726853262.88589: done checking for 
max_fail_percentage 11683 1726853262.88590: checking to see if all hosts have failed and the running result is not ok 11683 1726853262.88590: done checking to see if all hosts have failed 11683 1726853262.88591: getting the remaining hosts for this loop 11683 1726853262.88592: done getting the remaining hosts for this loop 11683 1726853262.88595: getting the next task for host managed_node3 11683 1726853262.88599: done getting next task for host managed_node3 11683 1726853262.88601: ^ task is: TASK: Get stat for interface {{ interface }} 11683 1726853262.88604: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853262.88606: getting variables 11683 1726853262.88607: in VariableManager get_vars() 11683 1726853262.88622: Calling all_inventory to load vars for managed_node3 11683 1726853262.88625: Calling groups_inventory to load vars for managed_node3 11683 1726853262.88626: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853262.88632: Calling all_plugins_play to load vars for managed_node3 11683 1726853262.88634: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853262.88637: Calling groups_plugins_play to load vars for managed_node3 11683 1726853262.89834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853262.91466: done with get_vars() 11683 1726853262.91494: done getting variables 11683 1726853262.91661: variable 'interface' from source: task vars 11683 1726853262.91665: variable 'controller_device' from source: play vars 11683 1726853262.91727: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 13:27:42 -0400 (0:00:00.087) 0:00:15.989 ****** 11683 1726853262.91762: entering _queue_task() for managed_node3/stat 11683 1726853262.92120: worker is 1 (out of 1 available) 11683 1726853262.92132: exiting _queue_task() for managed_node3/stat 11683 1726853262.92260: done queuing things up, now waiting for results queue to drain 11683 1726853262.92262: waiting for pending results... 
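The three `variable` lines just above show the chain that turns the templated task name into the rendered banner: the task var `interface` resolves to the play var `controller_device`, whose value is `nm-bond`. A toy rendering of that chain (Ansible uses full Jinja2 templating; plain `str.replace` stands in for it here):

```python
# Variable chain reconstructed from the log entries above:
play_vars = {"controller_device": "nm-bond"}
task_vars = {"interface": play_vars["controller_device"]}

name_template = "Get stat for interface {{ interface }}"
rendered = name_template.replace("{{ interface }}", task_vars["interface"])
print(rendered)  # Get stat for interface nm-bond
```

This is why the task defined as `Get stat for interface {{ interface }}` in get_interface_stat.yml appears in the output as `TASK [Get stat for interface nm-bond]`.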
11683 1726853262.92443: running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond 11683 1726853262.92601: in run() - task 02083763-bbaf-c5b2-e075-000000000241 11683 1726853262.92622: variable 'ansible_search_path' from source: unknown 11683 1726853262.92630: variable 'ansible_search_path' from source: unknown 11683 1726853262.92676: calling self._execute() 11683 1726853262.92769: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.92782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.92794: variable 'omit' from source: magic vars 11683 1726853262.93447: variable 'ansible_distribution_major_version' from source: facts 11683 1726853262.93576: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853262.93579: variable 'omit' from source: magic vars 11683 1726853262.93616: variable 'omit' from source: magic vars 11683 1726853262.93869: variable 'interface' from source: task vars 11683 1726853262.93938: variable 'controller_device' from source: play vars 11683 1726853262.93984: variable 'controller_device' from source: play vars 11683 1726853262.94072: variable 'omit' from source: magic vars 11683 1726853262.94117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853262.94192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853262.94374: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853262.94378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.94380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853262.94382: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853262.94384: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.94386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.94561: Set connection var ansible_shell_executable to /bin/sh 11683 1726853262.94802: Set connection var ansible_timeout to 10 11683 1726853262.94805: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853262.94807: Set connection var ansible_pipelining to False 11683 1726853262.94809: Set connection var ansible_shell_type to sh 11683 1726853262.94811: Set connection var ansible_connection to ssh 11683 1726853262.94813: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.94815: variable 'ansible_connection' from source: unknown 11683 1726853262.94816: variable 'ansible_module_compression' from source: unknown 11683 1726853262.94818: variable 'ansible_shell_type' from source: unknown 11683 1726853262.94820: variable 'ansible_shell_executable' from source: unknown 11683 1726853262.94821: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853262.94823: variable 'ansible_pipelining' from source: unknown 11683 1726853262.94826: variable 'ansible_timeout' from source: unknown 11683 1726853262.94828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853262.95199: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853262.95251: variable 'omit' from source: magic vars 11683 1726853262.95263: starting attempt loop 11683 1726853262.95269: running the handler 11683 1726853262.95291: _low_level_execute_command(): starting 11683 1726853262.95360: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 
1726853262.96649: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853262.96653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11683 1726853262.96657: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853262.96672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853262.96906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853262.96992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853262.97019: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853262.98760: stdout chunk (state=3): >>>/root <<< 11683 1726853262.98888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853262.98935: stderr chunk (state=3): >>><<< 11683 1726853262.98948: stdout chunk (state=3): >>><<< 11683 1726853262.99082: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853262.99097: _low_level_execute_command(): starting 11683 1726853262.99104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675 `" && echo ansible-tmp-1726853262.9908237-12341-144786332101675="` echo /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675 `" ) && sleep 0' 11683 1726853263.00536: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853263.00540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 11683 1726853263.00554: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853263.00557: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.00604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853263.00656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.00659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.00758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.02796: stdout chunk (state=3): >>>ansible-tmp-1726853262.9908237-12341-144786332101675=/root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675 <<< 11683 1726853263.02982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.02986: stdout chunk (state=3): >>><<< 11683 1726853263.02993: stderr chunk (state=3): >>><<< 11683 1726853263.03012: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853262.9908237-12341-144786332101675=/root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853263.03064: variable 'ansible_module_compression' from source: unknown 11683 1726853263.03155: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11683 1726853263.03158: variable 'ansible_facts' from source: unknown 11683 1726853263.03448: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py 11683 1726853263.03923: Sending initial data 11683 1726853263.03926: Sent initial data (153 bytes) 11683 1726853263.04906: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853263.05129: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.05154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853263.05272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.05292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.05381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.07061: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853263.07114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853263.07172: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpin975f8c /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py <<< 11683 1726853263.07176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py" <<< 11683 1726853263.07251: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpin975f8c" to remote "/root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py" <<< 11683 1726853263.08490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.08558: stderr chunk (state=3): >>><<< 11683 1726853263.08561: stdout chunk (state=3): >>><<< 11683 1726853263.08602: done transferring module to remote 11683 1726853263.08613: _low_level_execute_command(): starting 11683 1726853263.08618: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/ /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py && sleep 0' 11683 1726853263.09214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853263.09275: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853263.09282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853263.09298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853263.09304: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.09349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.09408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853263.09460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.09463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.09523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.11444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.11448: stdout chunk (state=3): >>><<< 11683 1726853263.11451: stderr chunk (state=3): >>><<< 11683 1726853263.11520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853263.11523: _low_level_execute_command(): starting 11683 1726853263.11526: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/AnsiballZ_stat.py && sleep 0' 11683 1726853263.12503: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853263.12800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.13029: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.13111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.28731: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28236, "dev": 23, "nlink": 1, "atime": 1726853261.8733222, "mtime": 1726853261.8733222, "ctime": 1726853261.8733222, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11683 1726853263.30127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.30194: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853263.30217: stdout chunk (state=3): >>><<< 11683 1726853263.30232: stderr chunk (state=3): >>><<< 11683 1726853263.30263: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 28236, "dev": 23, "nlink": 1, "atime": 1726853261.8733222, "mtime": 1726853261.8733222, "ctime": 1726853261.8733222, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853263.30380: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853263.30383: _low_level_execute_command(): starting 11683 1726853263.30386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853262.9908237-12341-144786332101675/ > /dev/null 2>&1 && sleep 0' 11683 1726853263.31154: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853263.31198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.31236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853263.31276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.31329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.33376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.33380: stderr chunk (state=3): >>><<< 11683 1726853263.33382: stdout chunk (state=3): >>><<< 11683 1726853263.33385: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853263.33387: handler run complete 11683 1726853263.33390: attempt loop complete, returning result 11683 1726853263.33392: _execute() done 11683 1726853263.33394: dumping result to json 11683 1726853263.33400: done dumping result, returning 11683 1726853263.33411: done running TaskExecutor() for managed_node3/TASK: Get stat for interface nm-bond [02083763-bbaf-c5b2-e075-000000000241] 11683 1726853263.33416: sending task result for task 02083763-bbaf-c5b2-e075-000000000241 11683 1726853263.33677: done sending task result for task 02083763-bbaf-c5b2-e075-000000000241 11683 1726853263.33680: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "atime": 1726853261.8733222, "block_size": 4096, "blocks": 0, "ctime": 1726853261.8733222, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 28236, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726853261.8733222, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11683 1726853263.33788: no more pending results, returning what we have 11683 1726853263.33791: results queue empty 11683 1726853263.33792: checking for any_errors_fatal 11683 1726853263.33793: done checking for any_errors_fatal 11683 1726853263.33794: checking 
for max_fail_percentage 11683 1726853263.33795: done checking for max_fail_percentage 11683 1726853263.33796: checking to see if all hosts have failed and the running result is not ok 11683 1726853263.33797: done checking to see if all hosts have failed 11683 1726853263.33797: getting the remaining hosts for this loop 11683 1726853263.33799: done getting the remaining hosts for this loop 11683 1726853263.33802: getting the next task for host managed_node3 11683 1726853263.33809: done getting next task for host managed_node3 11683 1726853263.33812: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11683 1726853263.33815: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853263.33819: getting variables 11683 1726853263.33821: in VariableManager get_vars() 11683 1726853263.33860: Calling all_inventory to load vars for managed_node3 11683 1726853263.33863: Calling groups_inventory to load vars for managed_node3 11683 1726853263.33866: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.33878: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.33881: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.33884: Calling groups_plugins_play to load vars for managed_node3 11683 1726853263.35417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.37038: done with get_vars() 11683 1726853263.37063: done getting variables 11683 1726853263.37126: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853263.37246: variable 'interface' from source: task vars 11683 1726853263.37250: variable 'controller_device' from source: play vars 11683 1726853263.37313: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 13:27:43 -0400 (0:00:00.455) 0:00:16.445 ****** 11683 1726853263.37344: entering _queue_task() for managed_node3/assert 11683 1726853263.37681: worker is 1 (out of 1 available) 11683 1726853263.37695: exiting _queue_task() for managed_node3/assert 11683 1726853263.37705: done queuing things up, now waiting for results queue to drain 11683 1726853263.37707: waiting for pending results... 
11683 1726853263.37994: running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' 11683 1726853263.38098: in run() - task 02083763-bbaf-c5b2-e075-00000000006f 11683 1726853263.38119: variable 'ansible_search_path' from source: unknown 11683 1726853263.38126: variable 'ansible_search_path' from source: unknown 11683 1726853263.38205: calling self._execute() 11683 1726853263.38269: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.38284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.38300: variable 'omit' from source: magic vars 11683 1726853263.38693: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.38711: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.38750: variable 'omit' from source: magic vars 11683 1726853263.38791: variable 'omit' from source: magic vars 11683 1726853263.38898: variable 'interface' from source: task vars 11683 1726853263.38908: variable 'controller_device' from source: play vars 11683 1726853263.38982: variable 'controller_device' from source: play vars 11683 1726853263.39078: variable 'omit' from source: magic vars 11683 1726853263.39082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853263.39100: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853263.39126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853263.39150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853263.39167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853263.39212: variable 'inventory_hostname' from source: 
host vars for 'managed_node3' 11683 1726853263.39221: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.39276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.39344: Set connection var ansible_shell_executable to /bin/sh 11683 1726853263.39361: Set connection var ansible_timeout to 10 11683 1726853263.39376: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853263.39386: Set connection var ansible_pipelining to False 11683 1726853263.39394: Set connection var ansible_shell_type to sh 11683 1726853263.39408: Set connection var ansible_connection to ssh 11683 1726853263.39439: variable 'ansible_shell_executable' from source: unknown 11683 1726853263.39447: variable 'ansible_connection' from source: unknown 11683 1726853263.39512: variable 'ansible_module_compression' from source: unknown 11683 1726853263.39515: variable 'ansible_shell_type' from source: unknown 11683 1726853263.39519: variable 'ansible_shell_executable' from source: unknown 11683 1726853263.39522: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.39524: variable 'ansible_pipelining' from source: unknown 11683 1726853263.39526: variable 'ansible_timeout' from source: unknown 11683 1726853263.39528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.39731: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853263.39735: variable 'omit' from source: magic vars 11683 1726853263.39737: starting attempt loop 11683 1726853263.39739: running the handler 11683 1726853263.39817: variable 'interface_stat' from source: set_fact 11683 1726853263.39847: Evaluated conditional 
(interface_stat.stat.exists): True 11683 1726853263.39862: handler run complete 11683 1726853263.39884: attempt loop complete, returning result 11683 1726853263.39893: _execute() done 11683 1726853263.39900: dumping result to json 11683 1726853263.39907: done dumping result, returning 11683 1726853263.39919: done running TaskExecutor() for managed_node3/TASK: Assert that the interface is present - 'nm-bond' [02083763-bbaf-c5b2-e075-00000000006f] 11683 1726853263.39929: sending task result for task 02083763-bbaf-c5b2-e075-00000000006f ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853263.40121: no more pending results, returning what we have 11683 1726853263.40125: results queue empty 11683 1726853263.40126: checking for any_errors_fatal 11683 1726853263.40135: done checking for any_errors_fatal 11683 1726853263.40136: checking for max_fail_percentage 11683 1726853263.40138: done checking for max_fail_percentage 11683 1726853263.40139: checking to see if all hosts have failed and the running result is not ok 11683 1726853263.40140: done checking to see if all hosts have failed 11683 1726853263.40141: getting the remaining hosts for this loop 11683 1726853263.40143: done getting the remaining hosts for this loop 11683 1726853263.40146: getting the next task for host managed_node3 11683 1726853263.40153: done getting next task for host managed_node3 11683 1726853263.40159: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11683 1726853263.40161: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853263.40165: getting variables 11683 1726853263.40167: in VariableManager get_vars() 11683 1726853263.40211: Calling all_inventory to load vars for managed_node3 11683 1726853263.40214: Calling groups_inventory to load vars for managed_node3 11683 1726853263.40217: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.40230: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.40233: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.40236: Calling groups_plugins_play to load vars for managed_node3 11683 1726853263.40889: done sending task result for task 02083763-bbaf-c5b2-e075-00000000006f 11683 1726853263.40892: WORKER PROCESS EXITING 11683 1726853263.41954: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.43486: done with get_vars() 11683 1726853263.43508: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Friday 20 September 2024 13:27:43 -0400 (0:00:00.062) 0:00:16.508 ****** 11683 1726853263.43604: entering _queue_task() for managed_node3/include_tasks 11683 1726853263.43931: worker is 1 (out of 1 available) 11683 1726853263.43944: exiting _queue_task() for managed_node3/include_tasks 11683 1726853263.44070: done queuing things up, now waiting for results queue to drain 11683 1726853263.44073: waiting for pending results... 
11683 1726853263.44249: running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' 11683 1726853263.44358: in run() - task 02083763-bbaf-c5b2-e075-000000000070 11683 1726853263.44381: variable 'ansible_search_path' from source: unknown 11683 1726853263.44441: variable 'controller_profile' from source: play vars 11683 1726853263.44635: variable 'controller_profile' from source: play vars 11683 1726853263.44655: variable 'port1_profile' from source: play vars 11683 1726853263.44732: variable 'port1_profile' from source: play vars 11683 1726853263.44744: variable 'port2_profile' from source: play vars 11683 1726853263.44804: variable 'port2_profile' from source: play vars 11683 1726853263.44819: variable 'omit' from source: magic vars 11683 1726853263.44941: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.44957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.44972: variable 'omit' from source: magic vars 11683 1726853263.45210: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.45222: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.45254: variable 'item' from source: unknown 11683 1726853263.45325: variable 'item' from source: unknown 11683 1726853263.45676: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.45679: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.45682: variable 'omit' from source: magic vars 11683 1726853263.45684: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.45697: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.45726: variable 'item' from source: unknown 11683 1726853263.45790: variable 'item' from source: unknown 11683 1726853263.46176: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 
1726853263.46179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.46182: variable 'omit' from source: magic vars 11683 1726853263.46185: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.46187: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.46189: variable 'item' from source: unknown 11683 1726853263.46190: variable 'item' from source: unknown 11683 1726853263.46249: dumping result to json 11683 1726853263.46259: done dumping result, returning 11683 1726853263.46268: done running TaskExecutor() for managed_node3/TASK: Include the task 'assert_profile_present.yml' [02083763-bbaf-c5b2-e075-000000000070] 11683 1726853263.46280: sending task result for task 02083763-bbaf-c5b2-e075-000000000070 11683 1726853263.46375: no more pending results, returning what we have 11683 1726853263.46380: in VariableManager get_vars() 11683 1726853263.46424: Calling all_inventory to load vars for managed_node3 11683 1726853263.46427: Calling groups_inventory to load vars for managed_node3 11683 1726853263.46429: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.46443: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.46446: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.46449: Calling groups_plugins_play to load vars for managed_node3 11683 1726853263.47194: done sending task result for task 02083763-bbaf-c5b2-e075-000000000070 11683 1726853263.47198: WORKER PROCESS EXITING 11683 1726853263.48042: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.49591: done with get_vars() 11683 1726853263.49618: variable 'ansible_search_path' from source: unknown 11683 1726853263.49637: variable 'ansible_search_path' from source: unknown 11683 1726853263.49647: variable 'ansible_search_path' from source: unknown 11683 
1726853263.49654: we have included files to process 11683 1726853263.49655: generating all_blocks data 11683 1726853263.49657: done generating all_blocks data 11683 1726853263.49660: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.49661: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.49664: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.49868: in VariableManager get_vars() 11683 1726853263.49894: done with get_vars() 11683 1726853263.50162: done processing included file 11683 1726853263.50164: iterating over new_blocks loaded from include file 11683 1726853263.50165: in VariableManager get_vars() 11683 1726853263.50185: done with get_vars() 11683 1726853263.50187: filtering new block on tags 11683 1726853263.50206: done filtering new block on tags 11683 1726853263.50208: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0) 11683 1726853263.50213: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.50214: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.50217: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.50317: in VariableManager get_vars() 11683 1726853263.50337: done with get_vars() 11683 1726853263.50567: done 
processing included file 11683 1726853263.50568: iterating over new_blocks loaded from include file 11683 1726853263.50570: in VariableManager get_vars() 11683 1726853263.50593: done with get_vars() 11683 1726853263.50595: filtering new block on tags 11683 1726853263.50612: done filtering new block on tags 11683 1726853263.50615: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.0) 11683 1726853263.50618: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.50619: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.50623: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11683 1726853263.50723: in VariableManager get_vars() 11683 1726853263.50798: done with get_vars() 11683 1726853263.51034: done processing included file 11683 1726853263.51036: iterating over new_blocks loaded from include file 11683 1726853263.51037: in VariableManager get_vars() 11683 1726853263.51053: done with get_vars() 11683 1726853263.51055: filtering new block on tags 11683 1726853263.51074: done filtering new block on tags 11683 1726853263.51076: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node3 => (item=bond0.1) 11683 1726853263.51079: extending task lists for all hosts with included blocks 11683 1726853263.53757: done extending task lists 11683 1726853263.53764: done processing included files 11683 1726853263.53765: results queue empty 11683 
1726853263.53766: checking for any_errors_fatal 11683 1726853263.53770: done checking for any_errors_fatal 11683 1726853263.53772: checking for max_fail_percentage 11683 1726853263.53773: done checking for max_fail_percentage 11683 1726853263.53774: checking to see if all hosts have failed and the running result is not ok 11683 1726853263.53775: done checking to see if all hosts have failed 11683 1726853263.53776: getting the remaining hosts for this loop 11683 1726853263.53777: done getting the remaining hosts for this loop 11683 1726853263.53779: getting the next task for host managed_node3 11683 1726853263.53784: done getting next task for host managed_node3 11683 1726853263.53786: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11683 1726853263.53788: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853263.53791: getting variables 11683 1726853263.53792: in VariableManager get_vars() 11683 1726853263.53808: Calling all_inventory to load vars for managed_node3 11683 1726853263.53810: Calling groups_inventory to load vars for managed_node3 11683 1726853263.53812: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.53818: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.53822: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.53825: Calling groups_plugins_play to load vars for managed_node3 11683 1726853263.59047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.60582: done with get_vars() 11683 1726853263.60608: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:27:43 -0400 (0:00:00.170) 0:00:16.678 ****** 11683 1726853263.60690: entering _queue_task() for managed_node3/include_tasks 11683 1726853263.61178: worker is 1 (out of 1 available) 11683 1726853263.61187: exiting _queue_task() for managed_node3/include_tasks 11683 1726853263.61197: done queuing things up, now waiting for results queue to drain 11683 1726853263.61198: waiting for pending results... 
11683 1726853263.61362: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11683 1726853263.61486: in run() - task 02083763-bbaf-c5b2-e075-00000000025f 11683 1726853263.61511: variable 'ansible_search_path' from source: unknown 11683 1726853263.61522: variable 'ansible_search_path' from source: unknown 11683 1726853263.61573: calling self._execute() 11683 1726853263.61650: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.61654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.61662: variable 'omit' from source: magic vars 11683 1726853263.61939: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.61950: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.61955: _execute() done 11683 1726853263.61958: dumping result to json 11683 1726853263.61961: done dumping result, returning 11683 1726853263.61968: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-c5b2-e075-00000000025f] 11683 1726853263.61976: sending task result for task 02083763-bbaf-c5b2-e075-00000000025f 11683 1726853263.62060: done sending task result for task 02083763-bbaf-c5b2-e075-00000000025f 11683 1726853263.62062: WORKER PROCESS EXITING 11683 1726853263.62106: no more pending results, returning what we have 11683 1726853263.62111: in VariableManager get_vars() 11683 1726853263.62159: Calling all_inventory to load vars for managed_node3 11683 1726853263.62162: Calling groups_inventory to load vars for managed_node3 11683 1726853263.62164: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.62177: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.62180: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.62183: Calling groups_plugins_play to load vars for managed_node3 11683 
1726853263.62935: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.64788: done with get_vars() 11683 1726853263.64803: variable 'ansible_search_path' from source: unknown 11683 1726853263.64804: variable 'ansible_search_path' from source: unknown 11683 1726853263.64830: we have included files to process 11683 1726853263.64831: generating all_blocks data 11683 1726853263.64832: done generating all_blocks data 11683 1726853263.64833: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853263.64833: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853263.64835: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853263.65487: done processing included file 11683 1726853263.65489: iterating over new_blocks loaded from include file 11683 1726853263.65490: in VariableManager get_vars() 11683 1726853263.65502: done with get_vars() 11683 1726853263.65504: filtering new block on tags 11683 1726853263.65518: done filtering new block on tags 11683 1726853263.65519: in VariableManager get_vars() 11683 1726853263.65529: done with get_vars() 11683 1726853263.65530: filtering new block on tags 11683 1726853263.65543: done filtering new block on tags 11683 1726853263.65547: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11683 1726853263.65550: extending task lists for all hosts with included blocks 11683 1726853263.65807: done extending task lists 11683 1726853263.65809: done processing included files 11683 1726853263.65809: results queue empty 11683 
1726853263.65810: checking for any_errors_fatal 11683 1726853263.65813: done checking for any_errors_fatal 11683 1726853263.65813: checking for max_fail_percentage 11683 1726853263.65814: done checking for max_fail_percentage 11683 1726853263.65815: checking to see if all hosts have failed and the running result is not ok 11683 1726853263.65816: done checking to see if all hosts have failed 11683 1726853263.65816: getting the remaining hosts for this loop 11683 1726853263.65817: done getting the remaining hosts for this loop 11683 1726853263.65819: getting the next task for host managed_node3 11683 1726853263.65824: done getting next task for host managed_node3 11683 1726853263.65826: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11683 1726853263.65828: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853263.65830: getting variables 11683 1726853263.65831: in VariableManager get_vars() 11683 1726853263.65843: Calling all_inventory to load vars for managed_node3 11683 1726853263.65847: Calling groups_inventory to load vars for managed_node3 11683 1726853263.65849: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.65855: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.65857: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.65859: Calling groups_plugins_play to load vars for managed_node3 11683 1726853263.67054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.69213: done with get_vars() 11683 1726853263.69240: done getting variables 11683 1726853263.69424: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:43 -0400 (0:00:00.087) 0:00:16.766 ****** 11683 1726853263.69465: entering _queue_task() for managed_node3/set_fact 11683 1726853263.69739: worker is 1 (out of 1 available) 11683 1726853263.69755: exiting _queue_task() for managed_node3/set_fact 11683 1726853263.69766: done queuing things up, now waiting for results queue to drain 11683 1726853263.69767: waiting for pending results... 
11683 1726853263.69938: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11683 1726853263.69999: in run() - task 02083763-bbaf-c5b2-e075-0000000003b0 11683 1726853263.70012: variable 'ansible_search_path' from source: unknown 11683 1726853263.70016: variable 'ansible_search_path' from source: unknown 11683 1726853263.70041: calling self._execute() 11683 1726853263.70117: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.70122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.70128: variable 'omit' from source: magic vars 11683 1726853263.70395: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.70404: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.70409: variable 'omit' from source: magic vars 11683 1726853263.70441: variable 'omit' from source: magic vars 11683 1726853263.70465: variable 'omit' from source: magic vars 11683 1726853263.70496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853263.70522: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853263.70539: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853263.70556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853263.70565: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853263.70589: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853263.70592: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.70594: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11683 1726853263.70665: Set connection var ansible_shell_executable to /bin/sh 11683 1726853263.70674: Set connection var ansible_timeout to 10 11683 1726853263.70681: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853263.70686: Set connection var ansible_pipelining to False 11683 1726853263.70688: Set connection var ansible_shell_type to sh 11683 1726853263.70691: Set connection var ansible_connection to ssh 11683 1726853263.70706: variable 'ansible_shell_executable' from source: unknown 11683 1726853263.70709: variable 'ansible_connection' from source: unknown 11683 1726853263.70711: variable 'ansible_module_compression' from source: unknown 11683 1726853263.70714: variable 'ansible_shell_type' from source: unknown 11683 1726853263.70716: variable 'ansible_shell_executable' from source: unknown 11683 1726853263.70718: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.70720: variable 'ansible_pipelining' from source: unknown 11683 1726853263.70723: variable 'ansible_timeout' from source: unknown 11683 1726853263.70728: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.70825: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853263.70833: variable 'omit' from source: magic vars 11683 1726853263.70838: starting attempt loop 11683 1726853263.70840: running the handler 11683 1726853263.70851: handler run complete 11683 1726853263.70860: attempt loop complete, returning result 11683 1726853263.70862: _execute() done 11683 1726853263.70864: dumping result to json 11683 1726853263.70867: done dumping result, returning 11683 1726853263.70875: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-c5b2-e075-0000000003b0] 11683 1726853263.70882: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b0 11683 1726853263.70958: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b0 11683 1726853263.70962: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11683 1726853263.71033: no more pending results, returning what we have 11683 1726853263.71036: results queue empty 11683 1726853263.71037: checking for any_errors_fatal 11683 1726853263.71038: done checking for any_errors_fatal 11683 1726853263.71039: checking for max_fail_percentage 11683 1726853263.71041: done checking for max_fail_percentage 11683 1726853263.71041: checking to see if all hosts have failed and the running result is not ok 11683 1726853263.71042: done checking to see if all hosts have failed 11683 1726853263.71043: getting the remaining hosts for this loop 11683 1726853263.71047: done getting the remaining hosts for this loop 11683 1726853263.71051: getting the next task for host managed_node3 11683 1726853263.71058: done getting next task for host managed_node3 11683 1726853263.71061: ^ task is: TASK: Stat profile file 11683 1726853263.71065: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853263.71069: getting variables 11683 1726853263.71072: in VariableManager get_vars() 11683 1726853263.71147: Calling all_inventory to load vars for managed_node3 11683 1726853263.71165: Calling groups_inventory to load vars for managed_node3 11683 1726853263.71169: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853263.71213: Calling all_plugins_play to load vars for managed_node3 11683 1726853263.71217: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853263.71230: Calling groups_plugins_play to load vars for managed_node3 11683 1726853263.72858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853263.74650: done with get_vars() 11683 1726853263.74677: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:43 -0400 (0:00:00.053) 0:00:16.819 ****** 11683 1726853263.74790: entering _queue_task() for managed_node3/stat 11683 1726853263.75127: worker is 1 (out of 1 available) 11683 1726853263.75138: exiting _queue_task() for managed_node3/stat 11683 1726853263.75151: done queuing things up, now waiting for results queue to drain 11683 1726853263.75266: waiting for pending results... 
11683 1726853263.75453: running TaskExecutor() for managed_node3/TASK: Stat profile file 11683 1726853263.75707: in run() - task 02083763-bbaf-c5b2-e075-0000000003b1 11683 1726853263.75711: variable 'ansible_search_path' from source: unknown 11683 1726853263.75714: variable 'ansible_search_path' from source: unknown 11683 1726853263.75717: calling self._execute() 11683 1726853263.75752: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.75763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.75779: variable 'omit' from source: magic vars 11683 1726853263.76159: variable 'ansible_distribution_major_version' from source: facts 11683 1726853263.76177: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853263.76188: variable 'omit' from source: magic vars 11683 1726853263.76235: variable 'omit' from source: magic vars 11683 1726853263.76341: variable 'profile' from source: include params 11683 1726853263.76360: variable 'item' from source: include params 11683 1726853263.76427: variable 'item' from source: include params 11683 1726853263.76453: variable 'omit' from source: magic vars 11683 1726853263.76504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853263.76542: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853263.76576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853263.76597: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853263.76614: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853263.76679: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853263.76682: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.76685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.76767: Set connection var ansible_shell_executable to /bin/sh 11683 1726853263.76791: Set connection var ansible_timeout to 10 11683 1726853263.76803: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853263.76811: Set connection var ansible_pipelining to False 11683 1726853263.76877: Set connection var ansible_shell_type to sh 11683 1726853263.76880: Set connection var ansible_connection to ssh 11683 1726853263.76883: variable 'ansible_shell_executable' from source: unknown 11683 1726853263.76885: variable 'ansible_connection' from source: unknown 11683 1726853263.76893: variable 'ansible_module_compression' from source: unknown 11683 1726853263.76895: variable 'ansible_shell_type' from source: unknown 11683 1726853263.76897: variable 'ansible_shell_executable' from source: unknown 11683 1726853263.76900: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853263.76902: variable 'ansible_pipelining' from source: unknown 11683 1726853263.76904: variable 'ansible_timeout' from source: unknown 11683 1726853263.76906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853263.77122: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853263.77139: variable 'omit' from source: magic vars 11683 1726853263.77151: starting attempt loop 11683 1726853263.77157: running the handler 11683 1726853263.77177: _low_level_execute_command(): starting 11683 1726853263.77218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853263.78002: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.78055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.78086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.78228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.79968: stdout chunk (state=3): >>>/root <<< 11683 1726853263.80043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.80277: stderr chunk (state=3): >>><<< 11683 1726853263.80281: stdout chunk (state=3): >>><<< 11683 1726853263.80284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853263.80287: _low_level_execute_command(): starting 11683 1726853263.80292: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117 `" && echo ansible-tmp-1726853263.8012378-12377-179722893446117="` echo /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117 `" ) && sleep 0' 11683 1726853263.80766: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853263.80776: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853263.80787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853263.80801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853263.80813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853263.80826: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853263.80829: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.80849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853263.80852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853263.80860: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853263.80867: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853263.80881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853263.80891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853263.80898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853263.80905: stderr chunk (state=3): >>>debug2: match found <<< 11683 1726853263.80914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.80983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853263.81006: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.81009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.81107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.83131: stdout chunk (state=3): >>>ansible-tmp-1726853263.8012378-12377-179722893446117=/root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117 <<< 11683 1726853263.83225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.83291: stderr chunk (state=3): >>><<< 11683 1726853263.83301: stdout chunk (state=3): >>><<< 11683 1726853263.83332: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853263.8012378-12377-179722893446117=/root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853263.83399: variable 'ansible_module_compression' from source: unknown 11683 1726853263.83478: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11683 1726853263.83690: variable 'ansible_facts' from source: unknown 11683 1726853263.83693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py 11683 1726853263.83828: Sending initial data 11683 1726853263.83926: Sent initial data (153 bytes) 11683 1726853263.84712: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11683 1726853263.84715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853263.84717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.84720: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853263.84722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.84781: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.84820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.84888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.86538: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853263.86587: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853263.86656: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpzr36dg2_ /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py <<< 11683 1726853263.86660: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py" <<< 11683 1726853263.86754: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpzr36dg2_" to remote "/root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py" <<< 11683 1726853263.87636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.87677: stderr chunk (state=3): >>><<< 11683 1726853263.87680: stdout chunk (state=3): >>><<< 11683 1726853263.87741: done transferring module to remote 11683 1726853263.87748: _low_level_execute_command(): starting 11683 1726853263.87752: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/ /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py && sleep 0' 11683 1726853263.88424: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853263.88676: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.88734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.88758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.88847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853263.90881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853263.90885: stdout chunk (state=3): >>><<< 11683 1726853263.90887: stderr chunk (state=3): >>><<< 11683 1726853263.90890: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853263.90892: _low_level_execute_command(): starting 11683 1726853263.90894: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/AnsiballZ_stat.py && sleep 0' 11683 1726853263.91518: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853263.91533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853263.91580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853263.91594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.91610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853263.91682: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853263.91719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853263.91751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853263.91797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853263.91863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.07790: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11683 1726853264.09598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853264.09602: stdout chunk (state=3): >>><<< 11683 1726853264.09604: stderr chunk (state=3): >>><<< 11683 1726853264.09606: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853264.09609: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853264.09611: _low_level_execute_command(): starting 11683 1726853264.09614: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853263.8012378-12377-179722893446117/ > /dev/null 2>&1 && sleep 0' 11683 1726853264.10579: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853264.10627: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853264.10790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.10849: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.13182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853264.13186: stdout chunk (state=3): >>><<< 11683 1726853264.13188: stderr chunk (state=3): >>><<< 11683 1726853264.13190: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853264.13193: handler run complete 11683 1726853264.13195: attempt loop 
complete, returning result 11683 1726853264.13196: _execute() done 11683 1726853264.13198: dumping result to json 11683 1726853264.13356: done dumping result, returning 11683 1726853264.13359: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-c5b2-e075-0000000003b1] 11683 1726853264.13362: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b1 11683 1726853264.13441: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b1 ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11683 1726853264.13508: no more pending results, returning what we have 11683 1726853264.13512: results queue empty 11683 1726853264.13513: checking for any_errors_fatal 11683 1726853264.13520: done checking for any_errors_fatal 11683 1726853264.13521: checking for max_fail_percentage 11683 1726853264.13523: done checking for max_fail_percentage 11683 1726853264.13523: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.13525: done checking to see if all hosts have failed 11683 1726853264.13527: getting the remaining hosts for this loop 11683 1726853264.13529: done getting the remaining hosts for this loop 11683 1726853264.13533: getting the next task for host managed_node3 11683 1726853264.13540: done getting next task for host managed_node3 11683 1726853264.13543: ^ task is: TASK: Set NM profile exist flag based on the profile files 11683 1726853264.13547: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853264.13551: getting variables 11683 1726853264.13553: in VariableManager get_vars() 11683 1726853264.13606: Calling all_inventory to load vars for managed_node3 11683 1726853264.13609: Calling groups_inventory to load vars for managed_node3 11683 1726853264.13612: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.13625: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.13628: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.13631: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.14496: WORKER PROCESS EXITING 11683 1726853264.16346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.18894: done with get_vars() 11683 1726853264.18922: done getting variables 11683 1726853264.18986: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:44 -0400 (0:00:00.442) 0:00:17.262 ****** 11683 1726853264.19018: 
entering _queue_task() for managed_node3/set_fact 11683 1726853264.19611: worker is 1 (out of 1 available) 11683 1726853264.19623: exiting _queue_task() for managed_node3/set_fact 11683 1726853264.19635: done queuing things up, now waiting for results queue to drain 11683 1726853264.19636: waiting for pending results... 11683 1726853264.20154: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11683 1726853264.20351: in run() - task 02083763-bbaf-c5b2-e075-0000000003b2 11683 1726853264.20365: variable 'ansible_search_path' from source: unknown 11683 1726853264.20369: variable 'ansible_search_path' from source: unknown 11683 1726853264.20510: calling self._execute() 11683 1726853264.20773: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.20777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.20781: variable 'omit' from source: magic vars 11683 1726853264.21641: variable 'ansible_distribution_major_version' from source: facts 11683 1726853264.21648: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853264.21859: variable 'profile_stat' from source: set_fact 11683 1726853264.21863: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853264.21866: when evaluation is False, skipping this task 11683 1726853264.21868: _execute() done 11683 1726853264.21872: dumping result to json 11683 1726853264.21874: done dumping result, returning 11683 1726853264.21877: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-c5b2-e075-0000000003b2] 11683 1726853264.21880: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b2 11683 1726853264.22013: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b2 11683 1726853264.22017: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": 
"profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853264.22098: no more pending results, returning what we have 11683 1726853264.22102: results queue empty 11683 1726853264.22103: checking for any_errors_fatal 11683 1726853264.22112: done checking for any_errors_fatal 11683 1726853264.22112: checking for max_fail_percentage 11683 1726853264.22114: done checking for max_fail_percentage 11683 1726853264.22115: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.22116: done checking to see if all hosts have failed 11683 1726853264.22117: getting the remaining hosts for this loop 11683 1726853264.22118: done getting the remaining hosts for this loop 11683 1726853264.22122: getting the next task for host managed_node3 11683 1726853264.22129: done getting next task for host managed_node3 11683 1726853264.22132: ^ task is: TASK: Get NM profile info 11683 1726853264.22137: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853264.22142: getting variables 11683 1726853264.22143: in VariableManager get_vars() 11683 1726853264.22191: Calling all_inventory to load vars for managed_node3 11683 1726853264.22194: Calling groups_inventory to load vars for managed_node3 11683 1726853264.22198: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.22212: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.22216: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.22219: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.25278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.28374: done with get_vars() 11683 1726853264.28404: done getting variables 11683 1726853264.28466: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:44 -0400 (0:00:00.096) 0:00:17.358 ****** 11683 1726853264.28688: entering _queue_task() for managed_node3/shell 11683 1726853264.29006: worker is 1 (out of 1 available) 11683 1726853264.29016: exiting _queue_task() for managed_node3/shell 11683 1726853264.29027: done queuing things up, now waiting for results queue to drain 11683 1726853264.29029: waiting for pending results... 
11683 1726853264.29463: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11683 1726853264.29468: in run() - task 02083763-bbaf-c5b2-e075-0000000003b3 11683 1726853264.29475: variable 'ansible_search_path' from source: unknown 11683 1726853264.29478: variable 'ansible_search_path' from source: unknown 11683 1726853264.29481: calling self._execute() 11683 1726853264.29668: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.29675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.29679: variable 'omit' from source: magic vars 11683 1726853264.29954: variable 'ansible_distribution_major_version' from source: facts 11683 1726853264.29965: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853264.29973: variable 'omit' from source: magic vars 11683 1726853264.30019: variable 'omit' from source: magic vars 11683 1726853264.30118: variable 'profile' from source: include params 11683 1726853264.30121: variable 'item' from source: include params 11683 1726853264.30188: variable 'item' from source: include params 11683 1726853264.30322: variable 'omit' from source: magic vars 11683 1726853264.30326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853264.30330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853264.30332: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853264.30335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853264.30337: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853264.30366: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853264.30370: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.30375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.30538: Set connection var ansible_shell_executable to /bin/sh 11683 1726853264.30541: Set connection var ansible_timeout to 10 11683 1726853264.30546: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853264.30549: Set connection var ansible_pipelining to False 11683 1726853264.30552: Set connection var ansible_shell_type to sh 11683 1726853264.30554: Set connection var ansible_connection to ssh 11683 1726853264.30557: variable 'ansible_shell_executable' from source: unknown 11683 1726853264.30560: variable 'ansible_connection' from source: unknown 11683 1726853264.30562: variable 'ansible_module_compression' from source: unknown 11683 1726853264.30564: variable 'ansible_shell_type' from source: unknown 11683 1726853264.30566: variable 'ansible_shell_executable' from source: unknown 11683 1726853264.30568: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.30572: variable 'ansible_pipelining' from source: unknown 11683 1726853264.30576: variable 'ansible_timeout' from source: unknown 11683 1726853264.30579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.30982: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853264.30986: variable 'omit' from source: magic vars 11683 1726853264.30988: starting attempt loop 11683 1726853264.30991: running the handler 11683 1726853264.30994: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853264.30997: _low_level_execute_command(): starting 11683 1726853264.30999: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853264.31493: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.31550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853264.31561: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853264.31584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.31674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.33390: stdout chunk (state=3): >>>/root <<< 11683 1726853264.33531: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853264.33908: stdout chunk (state=3): >>><<< 11683 1726853264.33911: stderr chunk 
(state=3): >>><<< 11683 1726853264.33915: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853264.33918: _low_level_execute_command(): starting 11683 1726853264.33921: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852 `" && echo ansible-tmp-1726853264.3381388-12403-171665529117852="` echo /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852 `" ) && sleep 0' 11683 1726853264.34852: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853264.35179: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853264.35203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853264.35242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.35307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.37310: stdout chunk (state=3): >>>ansible-tmp-1726853264.3381388-12403-171665529117852=/root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852 <<< 11683 1726853264.37459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853264.37469: stdout chunk (state=3): >>><<< 11683 1726853264.37482: stderr chunk (state=3): >>><<< 11683 1726853264.37776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853264.3381388-12403-171665529117852=/root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853264.37779: variable 'ansible_module_compression' from source: unknown 11683 1726853264.37782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853264.37784: variable 'ansible_facts' from source: unknown 11683 1726853264.37990: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py 11683 1726853264.38387: Sending initial data 11683 1726853264.38397: Sent initial data (156 bytes) 11683 1726853264.40177: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.40425: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853264.40470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.40526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.42288: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853264.42368: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853264.42422: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp73ovskom /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py <<< 11683 1726853264.42431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py" <<< 11683 1726853264.42506: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp73ovskom" to remote "/root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py" <<< 11683 1726853264.43885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853264.43894: stdout chunk (state=3): >>><<< 11683 1726853264.43903: stderr chunk (state=3): >>><<< 11683 1726853264.43953: done transferring module to remote 11683 1726853264.44159: _low_level_execute_command(): starting 11683 1726853264.44163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/ /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py && sleep 0' 11683 1726853264.45339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853264.45344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.45348: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853264.45351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853264.45355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.45566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.45688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.47594: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853264.47626: stderr chunk (state=3): >>><<< 11683 1726853264.47875: stdout chunk (state=3): >>><<< 11683 1726853264.47880: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853264.47887: _low_level_execute_command(): starting 11683 1726853264.47890: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/AnsiballZ_command.py && sleep 0' 11683 1726853264.49028: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853264.49041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.49053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853264.49083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.49289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853264.49293: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.49392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.67302: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:27:44.650244", "end": "2024-09-20 13:27:44.671642", "delta": "0:00:00.021398", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853264.69166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853264.69373: stdout chunk (state=3): >>><<< 11683 1726853264.69380: stderr chunk (state=3): >>><<< 11683 1726853264.69394: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 13:27:44.650244", "end": "2024-09-20 13:27:44.671642", "delta": "0:00:00.021398", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853264.69433: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853264.69440: _low_level_execute_command(): starting 11683 1726853264.69448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853264.3381388-12403-171665529117852/ > /dev/null 2>&1 && sleep 0' 11683 1726853264.70114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853264.70122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853264.70132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853264.70153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853264.70157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853264.70165: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853264.70177: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.70190: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853264.70198: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853264.70205: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853264.70213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853264.70222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853264.70234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853264.70242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853264.70249: stderr chunk (state=3): >>>debug2: match found <<< 11683 1726853264.70260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853264.70326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853264.70347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853264.70355: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853264.70443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853264.72657: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853264.72661: stdout chunk (state=3): >>><<< 11683 1726853264.72664: stderr chunk (state=3): >>><<< 11683 1726853264.72666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853264.72669: handler run complete 11683 1726853264.72673: Evaluated conditional (False): False 11683 1726853264.72685: attempt loop complete, returning result 11683 1726853264.72688: _execute() done 11683 1726853264.72690: dumping result to json 11683 1726853264.72692: done dumping result, returning 11683 1726853264.72703: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-c5b2-e075-0000000003b3] 11683 1726853264.72707: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b3 11683 1726853264.72831: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b3 11683 1726853264.72834: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.021398", "end": "2024-09-20 13:27:44.671642", "rc": 0, "start": "2024-09-20 13:27:44.650244" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 
/etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11683 1726853264.73019: no more pending results, returning what we have 11683 1726853264.73023: results queue empty 11683 1726853264.73024: checking for any_errors_fatal 11683 1726853264.73029: done checking for any_errors_fatal 11683 1726853264.73029: checking for max_fail_percentage 11683 1726853264.73031: done checking for max_fail_percentage 11683 1726853264.73032: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.73033: done checking to see if all hosts have failed 11683 1726853264.73033: getting the remaining hosts for this loop 11683 1726853264.73035: done getting the remaining hosts for this loop 11683 1726853264.73038: getting the next task for host managed_node3 11683 1726853264.73046: done getting next task for host managed_node3 11683 1726853264.73048: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11683 1726853264.73052: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853264.73056: getting variables 11683 1726853264.73058: in VariableManager get_vars() 11683 1726853264.73208: Calling all_inventory to load vars for managed_node3 11683 1726853264.73211: Calling groups_inventory to load vars for managed_node3 11683 1726853264.73213: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.73224: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.73226: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.73229: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.75064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.77041: done with get_vars() 11683 1726853264.77065: done getting variables 11683 1726853264.77152: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:44 -0400 (0:00:00.484) 0:00:17.843 ****** 11683 1726853264.77201: entering _queue_task() for managed_node3/set_fact 11683 1726853264.77600: worker is 1 (out of 1 available) 11683 1726853264.77615: exiting _queue_task() for managed_node3/set_fact 11683 1726853264.77629: done queuing things up, now waiting for results queue to drain 11683 1726853264.77630: waiting for pending results... 
11683 1726853264.78313: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11683 1726853264.78360: in run() - task 02083763-bbaf-c5b2-e075-0000000003b4 11683 1726853264.78365: variable 'ansible_search_path' from source: unknown 11683 1726853264.78369: variable 'ansible_search_path' from source: unknown 11683 1726853264.78420: calling self._execute() 11683 1726853264.78579: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.78583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.78586: variable 'omit' from source: magic vars 11683 1726853264.78994: variable 'ansible_distribution_major_version' from source: facts 11683 1726853264.79016: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853264.79166: variable 'nm_profile_exists' from source: set_fact 11683 1726853264.79194: Evaluated conditional (nm_profile_exists.rc == 0): True 11683 1726853264.79205: variable 'omit' from source: magic vars 11683 1726853264.79259: variable 'omit' from source: magic vars 11683 1726853264.79339: variable 'omit' from source: magic vars 11683 1726853264.79356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853264.79402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853264.79449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853264.79474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853264.79502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853264.79555: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
11683 1726853264.79559: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.79562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.79667: Set connection var ansible_shell_executable to /bin/sh 11683 1726853264.79686: Set connection var ansible_timeout to 10 11683 1726853264.79719: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853264.79722: Set connection var ansible_pipelining to False 11683 1726853264.79725: Set connection var ansible_shell_type to sh 11683 1726853264.79727: Set connection var ansible_connection to ssh 11683 1726853264.79750: variable 'ansible_shell_executable' from source: unknown 11683 1726853264.79826: variable 'ansible_connection' from source: unknown 11683 1726853264.79831: variable 'ansible_module_compression' from source: unknown 11683 1726853264.79834: variable 'ansible_shell_type' from source: unknown 11683 1726853264.79837: variable 'ansible_shell_executable' from source: unknown 11683 1726853264.79839: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.79841: variable 'ansible_pipelining' from source: unknown 11683 1726853264.79843: variable 'ansible_timeout' from source: unknown 11683 1726853264.79848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.79976: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853264.79998: variable 'omit' from source: magic vars 11683 1726853264.80010: starting attempt loop 11683 1726853264.80018: running the handler 11683 1726853264.80035: handler run complete 11683 1726853264.80094: attempt loop complete, returning result 11683 1726853264.80097: _execute() done 
11683 1726853264.80100: dumping result to json 11683 1726853264.80103: done dumping result, returning 11683 1726853264.80105: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-c5b2-e075-0000000003b4] 11683 1726853264.80108: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b4 11683 1726853264.80405: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b4 11683 1726853264.80408: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11683 1726853264.80463: no more pending results, returning what we have 11683 1726853264.80466: results queue empty 11683 1726853264.80468: checking for any_errors_fatal 11683 1726853264.80477: done checking for any_errors_fatal 11683 1726853264.80478: checking for max_fail_percentage 11683 1726853264.80480: done checking for max_fail_percentage 11683 1726853264.80480: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.80481: done checking to see if all hosts have failed 11683 1726853264.80482: getting the remaining hosts for this loop 11683 1726853264.80484: done getting the remaining hosts for this loop 11683 1726853264.80487: getting the next task for host managed_node3 11683 1726853264.80494: done getting next task for host managed_node3 11683 1726853264.80497: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11683 1726853264.80501: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853264.80505: getting variables 11683 1726853264.80506: in VariableManager get_vars() 11683 1726853264.80552: Calling all_inventory to load vars for managed_node3 11683 1726853264.80554: Calling groups_inventory to load vars for managed_node3 11683 1726853264.80557: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.80567: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.80569: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.80632: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.82056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.83669: done with get_vars() 11683 1726853264.83699: done getting variables 11683 1726853264.83764: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853264.83916: variable 'profile' from source: include params 11683 1726853264.83920: variable 'item' from source: include params 11683 1726853264.83984: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:44 -0400 (0:00:00.068) 0:00:17.912 ****** 11683 1726853264.84027: entering _queue_task() for managed_node3/command 11683 1726853264.84386: worker is 1 (out of 1 available) 11683 1726853264.84399: exiting _queue_task() for managed_node3/command 11683 1726853264.84411: done queuing things up, now waiting for results queue to drain 11683 1726853264.84413: waiting for pending results... 11683 1726853264.84704: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 11683 1726853264.84828: in run() - task 02083763-bbaf-c5b2-e075-0000000003b6 11683 1726853264.84876: variable 'ansible_search_path' from source: unknown 11683 1726853264.84880: variable 'ansible_search_path' from source: unknown 11683 1726853264.84901: calling self._execute() 11683 1726853264.85012: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.85024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.85041: variable 'omit' from source: magic vars 11683 1726853264.85442: variable 'ansible_distribution_major_version' from source: facts 11683 1726853264.85449: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853264.85569: variable 'profile_stat' from source: set_fact 11683 1726853264.85589: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853264.85661: when evaluation is False, skipping this task 11683 1726853264.85665: _execute() done 11683 1726853264.85668: dumping result to json 11683 1726853264.85670: done dumping result, returning 11683 1726853264.85674: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-c5b2-e075-0000000003b6] 11683 1726853264.85676: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b6 11683 
1726853264.85743: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b6 11683 1726853264.85749: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853264.85817: no more pending results, returning what we have 11683 1726853264.85822: results queue empty 11683 1726853264.85823: checking for any_errors_fatal 11683 1726853264.85831: done checking for any_errors_fatal 11683 1726853264.85832: checking for max_fail_percentage 11683 1726853264.85834: done checking for max_fail_percentage 11683 1726853264.85835: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.85836: done checking to see if all hosts have failed 11683 1726853264.85837: getting the remaining hosts for this loop 11683 1726853264.85838: done getting the remaining hosts for this loop 11683 1726853264.85842: getting the next task for host managed_node3 11683 1726853264.85851: done getting next task for host managed_node3 11683 1726853264.85854: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11683 1726853264.85858: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11683 1726853264.85863: getting variables 11683 1726853264.85865: in VariableManager get_vars() 11683 1726853264.85910: Calling all_inventory to load vars for managed_node3 11683 1726853264.85913: Calling groups_inventory to load vars for managed_node3 11683 1726853264.85916: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.85929: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.85932: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.85935: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.87757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.89376: done with get_vars() 11683 1726853264.89408: done getting variables 11683 1726853264.89485: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853264.89608: variable 'profile' from source: include params 11683 1726853264.89612: variable 'item' from source: include params 11683 1726853264.89676: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:44 -0400 (0:00:00.056) 0:00:17.969 ****** 11683 1726853264.89708: entering _queue_task() for managed_node3/set_fact 11683 1726853264.90064: worker is 1 (out of 1 available) 11683 1726853264.90079: exiting _queue_task() for managed_node3/set_fact 11683 1726853264.90092: done queuing things up, now waiting for results queue 
to drain 11683 1726853264.90094: waiting for pending results... 11683 1726853264.90390: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 11683 1726853264.90478: in run() - task 02083763-bbaf-c5b2-e075-0000000003b7 11683 1726853264.90482: variable 'ansible_search_path' from source: unknown 11683 1726853264.90485: variable 'ansible_search_path' from source: unknown 11683 1726853264.90496: calling self._execute() 11683 1726853264.90597: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.90601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.90651: variable 'omit' from source: magic vars 11683 1726853264.90984: variable 'ansible_distribution_major_version' from source: facts 11683 1726853264.90996: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853264.91115: variable 'profile_stat' from source: set_fact 11683 1726853264.91127: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853264.91130: when evaluation is False, skipping this task 11683 1726853264.91133: _execute() done 11683 1726853264.91136: dumping result to json 11683 1726853264.91139: done dumping result, returning 11683 1726853264.91178: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [02083763-bbaf-c5b2-e075-0000000003b7] 11683 1726853264.91185: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b7 11683 1726853264.91253: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b7 11683 1726853264.91256: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853264.91342: no more pending results, returning what we have 11683 1726853264.91348: results queue empty 11683 1726853264.91350: checking for any_errors_fatal 11683 1726853264.91357: 
done checking for any_errors_fatal 11683 1726853264.91358: checking for max_fail_percentage 11683 1726853264.91360: done checking for max_fail_percentage 11683 1726853264.91360: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.91361: done checking to see if all hosts have failed 11683 1726853264.91362: getting the remaining hosts for this loop 11683 1726853264.91363: done getting the remaining hosts for this loop 11683 1726853264.91366: getting the next task for host managed_node3 11683 1726853264.91374: done getting next task for host managed_node3 11683 1726853264.91376: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11683 1726853264.91380: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853264.91383: getting variables 11683 1726853264.91385: in VariableManager get_vars() 11683 1726853264.91496: Calling all_inventory to load vars for managed_node3 11683 1726853264.91499: Calling groups_inventory to load vars for managed_node3 11683 1726853264.91501: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.91515: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.91518: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.91521: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.92782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.94350: done with get_vars() 11683 1726853264.94381: done getting variables 11683 1726853264.94450: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853264.94564: variable 'profile' from source: include params 11683 1726853264.94568: variable 'item' from source: include params 11683 1726853264.94630: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:44 -0400 (0:00:00.049) 0:00:18.018 ****** 11683 1726853264.94662: entering _queue_task() for managed_node3/command 11683 1726853264.95016: worker is 1 (out of 1 available) 11683 1726853264.95030: exiting _queue_task() for managed_node3/command 11683 1726853264.95043: done queuing things up, now waiting for results queue to drain 11683 1726853264.95045: waiting for pending results... 
11683 1726853264.95455: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 11683 1726853264.95462: in run() - task 02083763-bbaf-c5b2-e075-0000000003b8 11683 1726853264.95465: variable 'ansible_search_path' from source: unknown 11683 1726853264.95468: variable 'ansible_search_path' from source: unknown 11683 1726853264.95497: calling self._execute() 11683 1726853264.95596: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853264.95599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853264.95608: variable 'omit' from source: magic vars 11683 1726853264.95978: variable 'ansible_distribution_major_version' from source: facts 11683 1726853264.95999: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853264.96123: variable 'profile_stat' from source: set_fact 11683 1726853264.96177: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853264.96181: when evaluation is False, skipping this task 11683 1726853264.96183: _execute() done 11683 1726853264.96186: dumping result to json 11683 1726853264.96188: done dumping result, returning 11683 1726853264.96191: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0 [02083763-bbaf-c5b2-e075-0000000003b8] 11683 1726853264.96193: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b8 11683 1726853264.96261: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b8 11683 1726853264.96264: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853264.96323: no more pending results, returning what we have 11683 1726853264.96327: results queue empty 11683 1726853264.96328: checking for any_errors_fatal 11683 1726853264.96335: done checking for any_errors_fatal 11683 1726853264.96336: checking for 
max_fail_percentage 11683 1726853264.96338: done checking for max_fail_percentage 11683 1726853264.96339: checking to see if all hosts have failed and the running result is not ok 11683 1726853264.96340: done checking to see if all hosts have failed 11683 1726853264.96341: getting the remaining hosts for this loop 11683 1726853264.96342: done getting the remaining hosts for this loop 11683 1726853264.96346: getting the next task for host managed_node3 11683 1726853264.96353: done getting next task for host managed_node3 11683 1726853264.96355: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11683 1726853264.96360: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853264.96364: getting variables 11683 1726853264.96366: in VariableManager get_vars() 11683 1726853264.96413: Calling all_inventory to load vars for managed_node3 11683 1726853264.96416: Calling groups_inventory to load vars for managed_node3 11683 1726853264.96419: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853264.96434: Calling all_plugins_play to load vars for managed_node3 11683 1726853264.96437: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853264.96440: Calling groups_plugins_play to load vars for managed_node3 11683 1726853264.98123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853264.99668: done with get_vars() 11683 1726853264.99697: done getting variables 11683 1726853264.99764: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853264.99880: variable 'profile' from source: include params 11683 1726853264.99884: variable 'item' from source: include params 11683 1726853264.99940: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:44 -0400 (0:00:00.053) 0:00:18.071 ****** 11683 1726853264.99979: entering _queue_task() for managed_node3/set_fact 11683 1726853265.00428: worker is 1 (out of 1 available) 11683 1726853265.00440: exiting _queue_task() for managed_node3/set_fact 11683 1726853265.00451: done queuing things up, now waiting for results queue to drain 11683 1726853265.00452: waiting for pending results... 
11683 1726853265.00800: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 11683 1726853265.00806: in run() - task 02083763-bbaf-c5b2-e075-0000000003b9 11683 1726853265.00809: variable 'ansible_search_path' from source: unknown 11683 1726853265.00811: variable 'ansible_search_path' from source: unknown 11683 1726853265.00822: calling self._execute() 11683 1726853265.00922: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.00926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.00942: variable 'omit' from source: magic vars 11683 1726853265.01325: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.01342: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.01466: variable 'profile_stat' from source: set_fact 11683 1726853265.01483: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853265.01487: when evaluation is False, skipping this task 11683 1726853265.01490: _execute() done 11683 1726853265.01492: dumping result to json 11683 1726853265.01495: done dumping result, returning 11683 1726853265.01540: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [02083763-bbaf-c5b2-e075-0000000003b9] 11683 1726853265.01547: sending task result for task 02083763-bbaf-c5b2-e075-0000000003b9 11683 1726853265.01610: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003b9 11683 1726853265.01613: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853265.01794: no more pending results, returning what we have 11683 1726853265.01799: results queue empty 11683 1726853265.01800: checking for any_errors_fatal 11683 1726853265.01805: done checking for any_errors_fatal 11683 1726853265.01806: checking for 
max_fail_percentage 11683 1726853265.01808: done checking for max_fail_percentage 11683 1726853265.01809: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.01810: done checking to see if all hosts have failed 11683 1726853265.01810: getting the remaining hosts for this loop 11683 1726853265.01811: done getting the remaining hosts for this loop 11683 1726853265.01814: getting the next task for host managed_node3 11683 1726853265.01822: done getting next task for host managed_node3 11683 1726853265.01825: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11683 1726853265.01828: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853265.01832: getting variables 11683 1726853265.01833: in VariableManager get_vars() 11683 1726853265.01869: Calling all_inventory to load vars for managed_node3 11683 1726853265.01873: Calling groups_inventory to load vars for managed_node3 11683 1726853265.01876: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.01887: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.01890: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.01893: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.03268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.05013: done with get_vars() 11683 1726853265.05032: done getting variables 11683 1726853265.05080: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853265.05170: variable 'profile' from source: include params 11683 1726853265.05175: variable 'item' from source: include params 11683 1726853265.05218: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:45 -0400 (0:00:00.052) 0:00:18.124 ****** 11683 1726853265.05243: entering _queue_task() for managed_node3/assert 11683 1726853265.05492: worker is 1 (out of 1 available) 11683 1726853265.05506: exiting _queue_task() for managed_node3/assert 11683 1726853265.05519: done queuing things up, now waiting for results queue to drain 11683 1726853265.05520: waiting for pending results... 
11683 1726853265.05702: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' 11683 1726853265.05776: in run() - task 02083763-bbaf-c5b2-e075-000000000260 11683 1726853265.05786: variable 'ansible_search_path' from source: unknown 11683 1726853265.05791: variable 'ansible_search_path' from source: unknown 11683 1726853265.05818: calling self._execute() 11683 1726853265.05896: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.05900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.05908: variable 'omit' from source: magic vars 11683 1726853265.06181: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.06192: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.06196: variable 'omit' from source: magic vars 11683 1726853265.06224: variable 'omit' from source: magic vars 11683 1726853265.06299: variable 'profile' from source: include params 11683 1726853265.06303: variable 'item' from source: include params 11683 1726853265.06346: variable 'item' from source: include params 11683 1726853265.06361: variable 'omit' from source: magic vars 11683 1726853265.06395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853265.06424: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853265.06440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853265.06456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.06466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.06490: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853265.06494: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.06496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.06569: Set connection var ansible_shell_executable to /bin/sh 11683 1726853265.06579: Set connection var ansible_timeout to 10 11683 1726853265.06586: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853265.06590: Set connection var ansible_pipelining to False 11683 1726853265.06593: Set connection var ansible_shell_type to sh 11683 1726853265.06595: Set connection var ansible_connection to ssh 11683 1726853265.06612: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.06614: variable 'ansible_connection' from source: unknown 11683 1726853265.06618: variable 'ansible_module_compression' from source: unknown 11683 1726853265.06622: variable 'ansible_shell_type' from source: unknown 11683 1726853265.06624: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.06626: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.06630: variable 'ansible_pipelining' from source: unknown 11683 1726853265.06632: variable 'ansible_timeout' from source: unknown 11683 1726853265.06635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.06765: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853265.06775: variable 'omit' from source: magic vars 11683 1726853265.06781: starting attempt loop 11683 1726853265.06784: running the handler 11683 1726853265.06902: variable 'lsr_net_profile_exists' from source: set_fact 11683 1726853265.06905: Evaluated conditional 
(lsr_net_profile_exists): True 11683 1726853265.06908: handler run complete 11683 1726853265.06910: attempt loop complete, returning result 11683 1726853265.06912: _execute() done 11683 1726853265.06915: dumping result to json 11683 1726853265.06918: done dumping result, returning 11683 1726853265.06942: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0' [02083763-bbaf-c5b2-e075-000000000260] 11683 1726853265.06945: sending task result for task 02083763-bbaf-c5b2-e075-000000000260 11683 1726853265.07025: done sending task result for task 02083763-bbaf-c5b2-e075-000000000260 11683 1726853265.07027: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853265.07099: no more pending results, returning what we have 11683 1726853265.07102: results queue empty 11683 1726853265.07103: checking for any_errors_fatal 11683 1726853265.07109: done checking for any_errors_fatal 11683 1726853265.07109: checking for max_fail_percentage 11683 1726853265.07111: done checking for max_fail_percentage 11683 1726853265.07112: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.07113: done checking to see if all hosts have failed 11683 1726853265.07114: getting the remaining hosts for this loop 11683 1726853265.07115: done getting the remaining hosts for this loop 11683 1726853265.07118: getting the next task for host managed_node3 11683 1726853265.07124: done getting next task for host managed_node3 11683 1726853265.07126: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11683 1726853265.07129: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.07133: getting variables 11683 1726853265.07134: in VariableManager get_vars() 11683 1726853265.07178: Calling all_inventory to load vars for managed_node3 11683 1726853265.07180: Calling groups_inventory to load vars for managed_node3 11683 1726853265.07183: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.07192: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.07194: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.07196: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.08500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.09361: done with get_vars() 11683 1726853265.09381: done getting variables 11683 1726853265.09424: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853265.09512: variable 'profile' from source: include params 11683 1726853265.09516: variable 'item' from source: include params 11683 1726853265.09557: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:45 -0400 
(0:00:00.043) 0:00:18.167 ****** 11683 1726853265.09586: entering _queue_task() for managed_node3/assert 11683 1726853265.09832: worker is 1 (out of 1 available) 11683 1726853265.09848: exiting _queue_task() for managed_node3/assert 11683 1726853265.09861: done queuing things up, now waiting for results queue to drain 11683 1726853265.09862: waiting for pending results... 11683 1726853265.10040: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' 11683 1726853265.10108: in run() - task 02083763-bbaf-c5b2-e075-000000000261 11683 1726853265.10119: variable 'ansible_search_path' from source: unknown 11683 1726853265.10123: variable 'ansible_search_path' from source: unknown 11683 1726853265.10151: calling self._execute() 11683 1726853265.10228: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.10233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.10242: variable 'omit' from source: magic vars 11683 1726853265.10777: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.10780: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.10783: variable 'omit' from source: magic vars 11683 1726853265.10785: variable 'omit' from source: magic vars 11683 1726853265.10788: variable 'profile' from source: include params 11683 1726853265.10791: variable 'item' from source: include params 11683 1726853265.10807: variable 'item' from source: include params 11683 1726853265.10830: variable 'omit' from source: magic vars 11683 1726853265.10879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853265.10920: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853265.10951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 
1726853265.10975: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.10994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.11029: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853265.11038: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.11048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.11151: Set connection var ansible_shell_executable to /bin/sh 11683 1726853265.11168: Set connection var ansible_timeout to 10 11683 1726853265.11183: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853265.11193: Set connection var ansible_pipelining to False 11683 1726853265.11200: Set connection var ansible_shell_type to sh 11683 1726853265.11206: Set connection var ansible_connection to ssh 11683 1726853265.11230: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.11239: variable 'ansible_connection' from source: unknown 11683 1726853265.11248: variable 'ansible_module_compression' from source: unknown 11683 1726853265.11255: variable 'ansible_shell_type' from source: unknown 11683 1726853265.11260: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.11266: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.11275: variable 'ansible_pipelining' from source: unknown 11683 1726853265.11283: variable 'ansible_timeout' from source: unknown 11683 1726853265.11291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.11432: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853265.11453: variable 'omit' from source: magic vars 11683 1726853265.11465: starting attempt loop 11683 1726853265.11474: running the handler 11683 1726853265.11585: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11683 1726853265.11596: Evaluated conditional (lsr_net_profile_ansible_managed): True 11683 1726853265.11606: handler run complete 11683 1726853265.11624: attempt loop complete, returning result 11683 1726853265.11632: _execute() done 11683 1726853265.11638: dumping result to json 11683 1726853265.11650: done dumping result, returning 11683 1726853265.11662: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0' [02083763-bbaf-c5b2-e075-000000000261] 11683 1726853265.11674: sending task result for task 02083763-bbaf-c5b2-e075-000000000261 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853265.11827: no more pending results, returning what we have 11683 1726853265.11830: results queue empty 11683 1726853265.11831: checking for any_errors_fatal 11683 1726853265.11837: done checking for any_errors_fatal 11683 1726853265.11837: checking for max_fail_percentage 11683 1726853265.11839: done checking for max_fail_percentage 11683 1726853265.11840: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.11841: done checking to see if all hosts have failed 11683 1726853265.11842: getting the remaining hosts for this loop 11683 1726853265.11844: done getting the remaining hosts for this loop 11683 1726853265.11847: getting the next task for host managed_node3 11683 1726853265.11852: done getting next task for host managed_node3 11683 1726853265.11854: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11683 
1726853265.11857: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.11861: getting variables 11683 1726853265.11862: in VariableManager get_vars() 11683 1726853265.11906: Calling all_inventory to load vars for managed_node3 11683 1726853265.11909: Calling groups_inventory to load vars for managed_node3 11683 1726853265.11911: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.11924: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.11926: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.11929: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.12452: done sending task result for task 02083763-bbaf-c5b2-e075-000000000261 11683 1726853265.12456: WORKER PROCESS EXITING 11683 1726853265.13558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.15140: done with get_vars() 11683 1726853265.15169: done getting variables 11683 1726853265.15235: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853265.15348: variable 'profile' from source: include params 11683 1726853265.15352: variable 'item' from 
source: include params 11683 1726853265.15411: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:45 -0400 (0:00:00.058) 0:00:18.226 ****** 11683 1726853265.15452: entering _queue_task() for managed_node3/assert 11683 1726853265.15784: worker is 1 (out of 1 available) 11683 1726853265.15797: exiting _queue_task() for managed_node3/assert 11683 1726853265.15809: done queuing things up, now waiting for results queue to drain 11683 1726853265.15810: waiting for pending results... 11683 1726853265.16063: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 11683 1726853265.16197: in run() - task 02083763-bbaf-c5b2-e075-000000000262 11683 1726853265.16202: variable 'ansible_search_path' from source: unknown 11683 1726853265.16377: variable 'ansible_search_path' from source: unknown 11683 1726853265.16382: calling self._execute() 11683 1726853265.16384: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.16386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.16389: variable 'omit' from source: magic vars 11683 1726853265.16743: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.16763: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.16776: variable 'omit' from source: magic vars 11683 1726853265.16820: variable 'omit' from source: magic vars 11683 1726853265.16929: variable 'profile' from source: include params 11683 1726853265.16942: variable 'item' from source: include params 11683 1726853265.17008: variable 'item' from source: include params 11683 1726853265.17033: variable 'omit' from source: magic vars 11683 1726853265.17084: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853265.17126: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853265.17162: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853265.17186: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.17205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.17242: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853265.17256: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.17273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.17484: Set connection var ansible_shell_executable to /bin/sh 11683 1726853265.17487: Set connection var ansible_timeout to 10 11683 1726853265.17489: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853265.17491: Set connection var ansible_pipelining to False 11683 1726853265.17493: Set connection var ansible_shell_type to sh 11683 1726853265.17496: Set connection var ansible_connection to ssh 11683 1726853265.17497: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.17499: variable 'ansible_connection' from source: unknown 11683 1726853265.17501: variable 'ansible_module_compression' from source: unknown 11683 1726853265.17503: variable 'ansible_shell_type' from source: unknown 11683 1726853265.17505: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.17506: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.17508: variable 'ansible_pipelining' from source: unknown 11683 1726853265.17510: variable 'ansible_timeout' from 
source: unknown 11683 1726853265.17512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.17641: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853265.17660: variable 'omit' from source: magic vars 11683 1726853265.17670: starting attempt loop 11683 1726853265.17679: running the handler 11683 1726853265.17789: variable 'lsr_net_profile_fingerprint' from source: set_fact 11683 1726853265.17799: Evaluated conditional (lsr_net_profile_fingerprint): True 11683 1726853265.17813: handler run complete 11683 1726853265.17834: attempt loop complete, returning result 11683 1726853265.17843: _execute() done 11683 1726853265.17854: dumping result to json 11683 1726853265.17862: done dumping result, returning 11683 1726853265.17875: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0 [02083763-bbaf-c5b2-e075-000000000262] 11683 1726853265.17887: sending task result for task 02083763-bbaf-c5b2-e075-000000000262 11683 1726853265.18276: done sending task result for task 02083763-bbaf-c5b2-e075-000000000262 11683 1726853265.18280: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853265.18319: no more pending results, returning what we have 11683 1726853265.18322: results queue empty 11683 1726853265.18323: checking for any_errors_fatal 11683 1726853265.18328: done checking for any_errors_fatal 11683 1726853265.18329: checking for max_fail_percentage 11683 1726853265.18330: done checking for max_fail_percentage 11683 1726853265.18331: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.18332: done checking to see if all hosts have 
failed 11683 1726853265.18333: getting the remaining hosts for this loop 11683 1726853265.18334: done getting the remaining hosts for this loop 11683 1726853265.18337: getting the next task for host managed_node3 11683 1726853265.18348: done getting next task for host managed_node3 11683 1726853265.18351: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11683 1726853265.18354: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.18358: getting variables 11683 1726853265.18359: in VariableManager get_vars() 11683 1726853265.18397: Calling all_inventory to load vars for managed_node3 11683 1726853265.18400: Calling groups_inventory to load vars for managed_node3 11683 1726853265.18402: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.18412: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.18414: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.18417: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.19725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.21309: done with get_vars() 11683 1726853265.21336: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 13:27:45 -0400 (0:00:00.059) 0:00:18.286 ****** 11683 1726853265.21435: entering _queue_task() for managed_node3/include_tasks 11683 1726853265.21770: worker is 1 (out of 1 available) 11683 1726853265.21785: exiting _queue_task() for managed_node3/include_tasks 11683 1726853265.21797: done queuing things up, now waiting for results queue to drain 11683 1726853265.21798: waiting for pending results... 11683 1726853265.22077: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11683 1726853265.22192: in run() - task 02083763-bbaf-c5b2-e075-000000000266 11683 1726853265.22215: variable 'ansible_search_path' from source: unknown 11683 1726853265.22223: variable 'ansible_search_path' from source: unknown 11683 1726853265.22264: calling self._execute() 11683 1726853265.22364: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.22377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.22390: variable 'omit' from source: magic vars 11683 1726853265.22769: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.22789: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.22801: _execute() done 11683 1726853265.22809: dumping result to json 11683 1726853265.22817: done dumping result, returning 11683 1726853265.22827: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-c5b2-e075-000000000266] 11683 1726853265.22837: sending task result for task 02083763-bbaf-c5b2-e075-000000000266 11683 1726853265.22985: no more pending results, returning what we have 11683 1726853265.22990: in VariableManager get_vars() 11683 1726853265.23042: Calling all_inventory to load vars for managed_node3 11683 1726853265.23047: Calling groups_inventory to load vars for managed_node3 11683 1726853265.23050: Calling all_plugins_inventory to load vars for 
managed_node3 11683 1726853265.23064: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.23068: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.23073: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.23784: done sending task result for task 02083763-bbaf-c5b2-e075-000000000266 11683 1726853265.23787: WORKER PROCESS EXITING 11683 1726853265.24785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.26330: done with get_vars() 11683 1726853265.26352: variable 'ansible_search_path' from source: unknown 11683 1726853265.26353: variable 'ansible_search_path' from source: unknown 11683 1726853265.26392: we have included files to process 11683 1726853265.26394: generating all_blocks data 11683 1726853265.26396: done generating all_blocks data 11683 1726853265.26399: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853265.26401: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853265.26403: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853265.27315: done processing included file 11683 1726853265.27317: iterating over new_blocks loaded from include file 11683 1726853265.27318: in VariableManager get_vars() 11683 1726853265.27338: done with get_vars() 11683 1726853265.27340: filtering new block on tags 11683 1726853265.27367: done filtering new block on tags 11683 1726853265.27372: in VariableManager get_vars() 11683 1726853265.27391: done with get_vars() 11683 1726853265.27392: filtering new block on tags 11683 1726853265.27414: done filtering new block on tags 11683 1726853265.27416: done iterating over new_blocks loaded 
from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11683 1726853265.27422: extending task lists for all hosts with included blocks 11683 1726853265.27601: done extending task lists 11683 1726853265.27603: done processing included files 11683 1726853265.27603: results queue empty 11683 1726853265.27604: checking for any_errors_fatal 11683 1726853265.27607: done checking for any_errors_fatal 11683 1726853265.27608: checking for max_fail_percentage 11683 1726853265.27609: done checking for max_fail_percentage 11683 1726853265.27609: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.27610: done checking to see if all hosts have failed 11683 1726853265.27611: getting the remaining hosts for this loop 11683 1726853265.27612: done getting the remaining hosts for this loop 11683 1726853265.27614: getting the next task for host managed_node3 11683 1726853265.27618: done getting next task for host managed_node3 11683 1726853265.27620: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11683 1726853265.27623: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.27626: getting variables 11683 1726853265.27627: in VariableManager get_vars() 11683 1726853265.27639: Calling all_inventory to load vars for managed_node3 11683 1726853265.27641: Calling groups_inventory to load vars for managed_node3 11683 1726853265.27646: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.27651: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.27654: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.27657: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.28821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.32166: done with get_vars() 11683 1726853265.32193: done getting variables 11683 1726853265.32240: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:45 -0400 (0:00:00.108) 0:00:18.394 ****** 11683 1726853265.32278: entering _queue_task() for managed_node3/set_fact 11683 1726853265.32922: worker is 1 (out of 1 available) 11683 1726853265.32933: exiting _queue_task() for managed_node3/set_fact 11683 1726853265.32946: done queuing things up, now waiting for results queue to drain 11683 1726853265.32947: waiting for pending results... 
11683 1726853265.33123: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11683 1726853265.33281: in run() - task 02083763-bbaf-c5b2-e075-0000000003f8 11683 1726853265.33303: variable 'ansible_search_path' from source: unknown 11683 1726853265.33310: variable 'ansible_search_path' from source: unknown 11683 1726853265.33355: calling self._execute() 11683 1726853265.33464: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.33480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.33499: variable 'omit' from source: magic vars 11683 1726853265.33929: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.33932: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.33935: variable 'omit' from source: magic vars 11683 1726853265.33993: variable 'omit' from source: magic vars 11683 1726853265.34051: variable 'omit' from source: magic vars 11683 1726853265.34113: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853265.34154: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853265.34222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853265.34226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.34228: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.34263: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853265.34275: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.34285: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11683 1726853265.34403: Set connection var ansible_shell_executable to /bin/sh 11683 1726853265.34470: Set connection var ansible_timeout to 10 11683 1726853265.34475: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853265.34574: Set connection var ansible_pipelining to False 11683 1726853265.34577: Set connection var ansible_shell_type to sh 11683 1726853265.34580: Set connection var ansible_connection to ssh 11683 1726853265.34582: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.34586: variable 'ansible_connection' from source: unknown 11683 1726853265.34589: variable 'ansible_module_compression' from source: unknown 11683 1726853265.34590: variable 'ansible_shell_type' from source: unknown 11683 1726853265.34663: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.34666: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.34669: variable 'ansible_pipelining' from source: unknown 11683 1726853265.34673: variable 'ansible_timeout' from source: unknown 11683 1726853265.34676: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.34804: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853265.34952: variable 'omit' from source: magic vars 11683 1726853265.34956: starting attempt loop 11683 1726853265.34958: running the handler 11683 1726853265.34960: handler run complete 11683 1726853265.34963: attempt loop complete, returning result 11683 1726853265.34965: _execute() done 11683 1726853265.34967: dumping result to json 11683 1726853265.35183: done dumping result, returning 11683 1726853265.35186: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-c5b2-e075-0000000003f8] 11683 1726853265.35189: sending task result for task 02083763-bbaf-c5b2-e075-0000000003f8 11683 1726853265.35255: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003f8 11683 1726853265.35258: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11683 1726853265.35348: no more pending results, returning what we have 11683 1726853265.35353: results queue empty 11683 1726853265.35354: checking for any_errors_fatal 11683 1726853265.35356: done checking for any_errors_fatal 11683 1726853265.35357: checking for max_fail_percentage 11683 1726853265.35359: done checking for max_fail_percentage 11683 1726853265.35360: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.35361: done checking to see if all hosts have failed 11683 1726853265.35362: getting the remaining hosts for this loop 11683 1726853265.35364: done getting the remaining hosts for this loop 11683 1726853265.35368: getting the next task for host managed_node3 11683 1726853265.35377: done getting next task for host managed_node3 11683 1726853265.35380: ^ task is: TASK: Stat profile file 11683 1726853265.35384: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.35389: getting variables 11683 1726853265.35392: in VariableManager get_vars() 11683 1726853265.35441: Calling all_inventory to load vars for managed_node3 11683 1726853265.35446: Calling groups_inventory to load vars for managed_node3 11683 1726853265.35450: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.35463: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.35466: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.35469: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.38546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.41462: done with get_vars() 11683 1726853265.41602: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:45 -0400 (0:00:00.094) 0:00:18.489 ****** 11683 1726853265.41713: entering _queue_task() for managed_node3/stat 11683 1726853265.42583: worker is 1 (out of 1 available) 11683 1726853265.42596: exiting _queue_task() for managed_node3/stat 11683 1726853265.42608: done queuing things up, now waiting for results queue to drain 11683 1726853265.42610: waiting for pending results... 
11683 1726853265.43290: running TaskExecutor() for managed_node3/TASK: Stat profile file 11683 1726853265.43294: in run() - task 02083763-bbaf-c5b2-e075-0000000003f9 11683 1726853265.43298: variable 'ansible_search_path' from source: unknown 11683 1726853265.43301: variable 'ansible_search_path' from source: unknown 11683 1726853265.43449: calling self._execute() 11683 1726853265.43603: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.43607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.43620: variable 'omit' from source: magic vars 11683 1726853265.44435: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.44449: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.44455: variable 'omit' from source: magic vars 11683 1726853265.44633: variable 'omit' from source: magic vars 11683 1726853265.44793: variable 'profile' from source: include params 11683 1726853265.44796: variable 'item' from source: include params 11683 1726853265.45023: variable 'item' from source: include params 11683 1726853265.45046: variable 'omit' from source: magic vars 11683 1726853265.45097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853265.45134: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853265.45157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853265.45175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.45284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.45320: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853265.45323: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.45326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.45540: Set connection var ansible_shell_executable to /bin/sh 11683 1726853265.45551: Set connection var ansible_timeout to 10 11683 1726853265.45559: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853265.45567: Set connection var ansible_pipelining to False 11683 1726853265.45569: Set connection var ansible_shell_type to sh 11683 1726853265.45573: Set connection var ansible_connection to ssh 11683 1726853265.45596: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.45599: variable 'ansible_connection' from source: unknown 11683 1726853265.45601: variable 'ansible_module_compression' from source: unknown 11683 1726853265.45604: variable 'ansible_shell_type' from source: unknown 11683 1726853265.45606: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.45608: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.45614: variable 'ansible_pipelining' from source: unknown 11683 1726853265.45617: variable 'ansible_timeout' from source: unknown 11683 1726853265.45784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.46128: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853265.46139: variable 'omit' from source: magic vars 11683 1726853265.46147: starting attempt loop 11683 1726853265.46150: running the handler 11683 1726853265.46387: _low_level_execute_command(): starting 11683 1726853265.46439: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853265.47618: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853265.47622: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.47624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.47627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853265.47629: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853265.47631: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853265.47664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.47674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.47876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.49505: stdout chunk (state=3): >>>/root <<< 11683 1726853265.49776: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.49780: stderr chunk (state=3): >>><<< 11683 1726853265.49783: stdout chunk (state=3): >>><<< 11683 1726853265.49786: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853265.49789: _low_level_execute_command(): starting 11683 1726853265.49792: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710 `" && echo ansible-tmp-1726853265.4967237-12445-85742965998710="` echo /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710 `" ) && sleep 0' 11683 1726853265.50442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853265.50456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.50461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.50479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 
1726853265.50676: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853265.50679: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853265.50681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.50683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853265.50686: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853265.50687: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853265.50689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.50691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.50693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853265.50695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853265.50697: stderr chunk (state=3): >>>debug2: match found <<< 11683 1726853265.50699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.50701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853265.50706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.50900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.50975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.52983: stdout chunk (state=3): >>>ansible-tmp-1726853265.4967237-12445-85742965998710=/root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710 <<< 11683 1726853265.53127: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.53140: stderr chunk (state=3): >>><<< 11683 1726853265.53145: stdout chunk (state=3): >>><<< 11683 1726853265.53164: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853265.4967237-12445-85742965998710=/root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853265.53215: variable 'ansible_module_compression' from source: unknown 11683 1726853265.53279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11683 1726853265.53326: variable 'ansible_facts' from source: unknown 11683 1726853265.53631: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py 
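The `( umask 77 && mkdir -p ... )` one-liner whose result is logged above is how Ansible provisions its per-task remote temp directory: the subshell-scoped umask guarantees the directory comes out mode 0700 so only the connecting user can read the module payload, and the echoed `ansible-tmp-...=` path is captured for the later transfer and cleanup steps. A minimal local sketch of the same pattern (the `ansible-demo` names are illustrative stand-ins, not Ansible's own paths):

```python
import os
import tempfile

# Equivalent of the subshell-scoped `umask 77`: save, set, and restore the
# process umask so the new directory is created owner-only (mode 0700).
old_umask = os.umask(0o77)
try:
    base = tempfile.mkdtemp(prefix="ansible-demo-")    # stand-in for /root/.ansible/tmp
    task_tmp = os.path.join(base, "ansible-tmp-demo")  # stand-in for ansible-tmp-<ts>-<pid>-<rand>
    os.mkdir(task_tmp)                                 # inherits the restrictive umask
finally:
    os.umask(old_umask)

# The directory mode should now be owner-only, as the umask intended.
print(oct(os.stat(task_tmp).st_mode & 0o777))  # 0o700
```

The subshell in the real command matters for the same reason the save/restore does here: the restrictive umask applies only to this one mkdir, not to the rest of the session.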
11683 1726853265.53825: Sending initial data 11683 1726853265.53828: Sent initial data (152 bytes) 11683 1726853265.54487: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853265.54582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.54589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.54681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.56528: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 
debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853265.56570: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853265.56651: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp03c7ypwj /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py <<< 11683 1726853265.56678: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py" <<< 11683 1726853265.56762: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp03c7ypwj" to remote "/root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py" <<< 11683 1726853265.57730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.57977: stderr chunk (state=3): >>><<< 11683 1726853265.57980: stdout chunk (state=3): >>><<< 11683 1726853265.57983: done transferring module to remote 11683 1726853265.57985: _low_level_execute_command(): starting 11683 1726853265.57987: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/ /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py && sleep 0' 11683 1726853265.58429: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853265.58438: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.58448: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.58459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853265.58474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853265.58481: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853265.58656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.58793: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.58875: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.60779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.60783: stdout chunk (state=3): >>><<< 11683 1726853265.60785: stderr chunk (state=3): >>><<< 11683 1726853265.60928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853265.60932: _low_level_execute_command(): starting 11683 1726853265.60935: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/AnsiballZ_stat.py && sleep 0' 11683 1726853265.61610: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.61615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.61617: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.61620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853265.61622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.61688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853265.61700: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.61732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.61812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.77436: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11683 1726853265.78799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853265.78825: stderr chunk (state=3): >>><<< 11683 1726853265.78829: stdout chunk (state=3): >>><<< 11683 1726853265.78846: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
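The `AnsiballZ_stat.py` payload that produced the stdout above is, at its core, a filesystem existence check serialized as JSON. A deliberately simplified stand-in for that behavior (the real `stat` module returns many more fields and honors `get_checksum`, `get_mime`, `checksum_algorithm`, etc.):

```python
import json
import os


def stat_module(path: str, follow: bool = False) -> dict:
    """Toy stand-in for Ansible's stat module; reports only existence."""
    # follow=True resolves symlinks like the real module's `follow` option;
    # lexists() treats a dangling symlink itself as existing.
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    return {"changed": False, "stat": {"exists": exists}}


# A missing profile file yields the same shape seen in the stdout chunk above.
print(json.dumps(stat_module("/etc/sysconfig/network-scripts/ifcfg-bond0.0")))
```

The controller only ever sees this JSON on stdout, which is why the log interleaves it with the SSH debug chatter arriving on stderr.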
11683 1726853265.78873: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853265.78883: _low_level_execute_command(): starting 11683 1726853265.78888: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853265.4967237-12445-85742965998710/ > /dev/null 2>&1 && sleep 0' 11683 1726853265.79346: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.79355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.79358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853265.79361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853265.79363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.79410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853265.79416: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.79422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.79509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.81361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.81385: stderr chunk (state=3): >>><<< 11683 1726853265.81388: stdout chunk (state=3): >>><<< 11683 1726853265.81403: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853265.81409: handler run complete 11683 1726853265.81427: attempt loop complete, returning result 11683 1726853265.81430: _execute() done 11683 1726853265.81433: dumping result to json 11683 1726853265.81435: done dumping result, returning 11683 1726853265.81445: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-c5b2-e075-0000000003f9] 11683 1726853265.81448: sending task result for task 02083763-bbaf-c5b2-e075-0000000003f9 11683 1726853265.81539: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003f9 11683 1726853265.81542: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11683 1726853265.81602: no more pending results, returning what we have 11683 1726853265.81606: results queue empty 11683 1726853265.81607: checking for any_errors_fatal 11683 1726853265.81613: done checking for any_errors_fatal 11683 1726853265.81613: checking for max_fail_percentage 11683 1726853265.81615: done checking for max_fail_percentage 11683 1726853265.81616: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.81618: done checking to see if all hosts have failed 11683 1726853265.81618: getting the remaining hosts for this loop 11683 1726853265.81620: done getting the remaining hosts for this loop 11683 1726853265.81623: getting the next task for host managed_node3 11683 1726853265.81629: done getting next task for host managed_node3 11683 1726853265.81631: ^ task is: TASK: Set NM profile exist flag based on the profile files 11683 1726853265.81635: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.81639: getting variables 11683 1726853265.81640: in VariableManager get_vars() 11683 1726853265.81686: Calling all_inventory to load vars for managed_node3 11683 1726853265.81689: Calling groups_inventory to load vars for managed_node3 11683 1726853265.81691: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.81703: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.81706: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.81708: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.82612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.86675: done with get_vars() 11683 1726853265.86693: done getting variables 11683 1726853265.86727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:45 -0400 (0:00:00.450) 0:00:18.939 ****** 11683 1726853265.86748: entering _queue_task() for managed_node3/set_fact 11683 1726853265.87011: worker is 1 (out of 1 available) 11683 1726853265.87026: exiting _queue_task() for managed_node3/set_fact 11683 1726853265.87037: done queuing things up, now waiting for results queue to drain 11683 1726853265.87038: waiting for pending results... 11683 1726853265.87219: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11683 1726853265.87305: in run() - task 02083763-bbaf-c5b2-e075-0000000003fa 11683 1726853265.87317: variable 'ansible_search_path' from source: unknown 11683 1726853265.87321: variable 'ansible_search_path' from source: unknown 11683 1726853265.87349: calling self._execute() 11683 1726853265.87420: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.87425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.87436: variable 'omit' from source: magic vars 11683 1726853265.87720: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.87730: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.87817: variable 'profile_stat' from source: set_fact 11683 1726853265.87827: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853265.87831: when evaluation is False, skipping this task 11683 1726853265.87834: _execute() done 11683 1726853265.87836: dumping result to json 11683 1726853265.87838: done dumping result, returning 11683 1726853265.87846: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-c5b2-e075-0000000003fa] 11683 1726853265.87850: sending task result for task 
02083763-bbaf-c5b2-e075-0000000003fa 11683 1726853265.87931: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003fa 11683 1726853265.87935: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853265.88015: no more pending results, returning what we have 11683 1726853265.88019: results queue empty 11683 1726853265.88020: checking for any_errors_fatal 11683 1726853265.88030: done checking for any_errors_fatal 11683 1726853265.88031: checking for max_fail_percentage 11683 1726853265.88033: done checking for max_fail_percentage 11683 1726853265.88033: checking to see if all hosts have failed and the running result is not ok 11683 1726853265.88034: done checking to see if all hosts have failed 11683 1726853265.88035: getting the remaining hosts for this loop 11683 1726853265.88036: done getting the remaining hosts for this loop 11683 1726853265.88040: getting the next task for host managed_node3 11683 1726853265.88049: done getting next task for host managed_node3 11683 1726853265.88051: ^ task is: TASK: Get NM profile info 11683 1726853265.88056: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11683 1726853265.88060: getting variables 11683 1726853265.88062: in VariableManager get_vars() 11683 1726853265.88122: Calling all_inventory to load vars for managed_node3 11683 1726853265.88125: Calling groups_inventory to load vars for managed_node3 11683 1726853265.88127: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853265.88140: Calling all_plugins_play to load vars for managed_node3 11683 1726853265.88145: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853265.88148: Calling groups_plugins_play to load vars for managed_node3 11683 1726853265.88934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853265.89812: done with get_vars() 11683 1726853265.89832: done getting variables 11683 1726853265.89880: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:45 -0400 (0:00:00.031) 0:00:18.971 ****** 11683 1726853265.89904: entering _queue_task() for managed_node3/shell 11683 1726853265.90478: worker is 1 (out of 1 available) 11683 1726853265.90489: exiting _queue_task() for managed_node3/shell 11683 1726853265.90500: done queuing things up, now waiting for results queue to drain 11683 1726853265.90501: waiting for pending results... 
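The skip recorded above ("Evaluated conditional (profile_stat.stat.exists): False ... when evaluation is False, skipping this task") is the standard `when:` short-circuit: the module never runs and a synthetic skipped result is emitted instead. A hedged sketch of that control flow (illustrative names, not Ansible's actual TaskExecutor internals):

```python
def execute_task(when_result: bool, condition_src: str, handler):
    """Illustrative `when:` short-circuit; not Ansible's real TaskExecutor."""
    if not when_result:
        # No connection, no module transfer: just report why we skipped.
        return {
            "changed": False,
            "false_condition": condition_src,
            "skip_reason": "Conditional result was False",
        }
    return handler()


profile_stat = {"stat": {"exists": False}}
result = execute_task(
    profile_stat["stat"]["exists"],
    "profile_stat.stat.exists",
    lambda: {"changed": True},
)
print(result["skip_reason"])  # the same reason shown in the skipping: payload
```

Note the asymmetry visible in the log: the preceding stat task cost a full SSH round-trip (~0.45s), while this skipped task resolved in ~0.03s entirely on the controller.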
11683 1726853265.90690: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11683 1726853265.90759: in run() - task 02083763-bbaf-c5b2-e075-0000000003fb 11683 1726853265.90783: variable 'ansible_search_path' from source: unknown 11683 1726853265.90790: variable 'ansible_search_path' from source: unknown 11683 1726853265.90830: calling self._execute() 11683 1726853265.90928: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.90979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.90998: variable 'omit' from source: magic vars 11683 1726853265.91413: variable 'ansible_distribution_major_version' from source: facts 11683 1726853265.91426: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853265.91430: variable 'omit' from source: magic vars 11683 1726853265.91466: variable 'omit' from source: magic vars 11683 1726853265.91546: variable 'profile' from source: include params 11683 1726853265.91550: variable 'item' from source: include params 11683 1726853265.91598: variable 'item' from source: include params 11683 1726853265.91612: variable 'omit' from source: magic vars 11683 1726853265.91646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853265.91677: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853265.91697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853265.91709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.91720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853265.91743: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853265.91749: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.91752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.91826: Set connection var ansible_shell_executable to /bin/sh 11683 1726853265.91835: Set connection var ansible_timeout to 10 11683 1726853265.91842: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853265.91848: Set connection var ansible_pipelining to False 11683 1726853265.91851: Set connection var ansible_shell_type to sh 11683 1726853265.91854: Set connection var ansible_connection to ssh 11683 1726853265.91873: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.91876: variable 'ansible_connection' from source: unknown 11683 1726853265.91879: variable 'ansible_module_compression' from source: unknown 11683 1726853265.91881: variable 'ansible_shell_type' from source: unknown 11683 1726853265.91883: variable 'ansible_shell_executable' from source: unknown 11683 1726853265.91886: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853265.91888: variable 'ansible_pipelining' from source: unknown 11683 1726853265.91891: variable 'ansible_timeout' from source: unknown 11683 1726853265.91895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853265.92002: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853265.92011: variable 'omit' from source: magic vars 11683 1726853265.92017: starting attempt loop 11683 1726853265.92020: running the handler 11683 1726853265.92028: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853265.92044: _low_level_execute_command(): starting 11683 1726853265.92053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853265.92558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.92594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.92598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.92601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.92656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853265.92664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.92667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.92731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.94467: stdout chunk (state=3): >>>/root <<< 11683 1726853265.94753: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.94757: stdout chunk (state=3): >>><<< 11683 1726853265.94761: stderr chunk (state=3): >>><<< 11683 1726853265.94765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853265.94767: _low_level_execute_command(): starting 11683 1726853265.94773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712 `" && echo ansible-tmp-1726853265.9465208-12469-239597676293712="` echo /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712 `" ) && sleep 0' 11683 1726853265.95377: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853265.95393: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.95413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853265.95431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853265.95450: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853265.95461: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853265.95531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.95575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853265.95595: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.95613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.95702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853265.97702: stdout chunk (state=3): >>>ansible-tmp-1726853265.9465208-12469-239597676293712=/root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712 <<< 11683 1726853265.97878: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853265.97882: stdout chunk (state=3): >>><<< 11683 1726853265.97885: stderr chunk (state=3): 
>>><<< 11683 1726853265.97905: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853265.9465208-12469-239597676293712=/root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853265.97949: variable 'ansible_module_compression' from source: unknown 11683 1726853265.98064: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853265.98067: variable 'ansible_facts' from source: unknown 11683 1726853265.98166: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py 11683 1726853265.98400: Sending initial data 11683 1726853265.98403: Sent initial data (156 bytes) 11683 1726853265.99031: stderr chunk (state=3): 
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853265.99111: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853265.99198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853265.99270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853265.99395: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853265.99612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853265.99798: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853266.01578: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853266.01632: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853266.01697: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpuaae31jy /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py <<< 11683 1726853266.01708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py" <<< 11683 1726853266.01755: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpuaae31jy" to remote "/root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py" <<< 11683 1726853266.03205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853266.03270: stderr chunk (state=3): >>><<< 11683 1726853266.03282: stdout chunk (state=3): >>><<< 11683 1726853266.03629: done transferring module to remote 11683 1726853266.03633: _low_level_execute_command(): starting 11683 1726853266.03635: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/ /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py && sleep 0' 11683 1726853266.04640: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853266.04688: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853266.05128: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853266.05196: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853266.05304: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853266.07308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853266.07340: stderr chunk (state=3): >>><<< 11683 1726853266.07357: stdout chunk (state=3): >>><<< 11683 1726853266.07564: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853266.07573: _low_level_execute_command(): starting 11683 1726853266.07577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/AnsiballZ_command.py && sleep 0' 11683 1726853266.08500: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853266.08504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853266.08516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853266.08669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853266.08685: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853266.26287: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:27:46.240742", "end": "2024-09-20 13:27:46.261638", "delta": "0:00:00.020896", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853266.27919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853266.27952: stderr chunk (state=3): >>><<< 11683 1726853266.27954: stdout chunk (state=3): >>><<< 11683 1726853266.27968: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 13:27:46.240742", "end": "2024-09-20 13:27:46.261638", "delta": "0:00:00.020896", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
Shared connection to 10.31.11.217 closed. 11683 1726853266.28005: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853266.28013: _low_level_execute_command(): starting 11683 1726853266.28017: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853265.9465208-12469-239597676293712/ > /dev/null 2>&1 && sleep 0' 11683 1726853266.28562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853266.28605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853266.28670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853266.30519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853266.30546: stderr chunk (state=3): >>><<< 11683 1726853266.30549: stdout chunk (state=3): >>><<< 11683 1726853266.30560: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853266.30576: handler run complete 11683 1726853266.30591: Evaluated 
conditional (False): False 11683 1726853266.30599: attempt loop complete, returning result 11683 1726853266.30602: _execute() done 11683 1726853266.30605: dumping result to json 11683 1726853266.30609: done dumping result, returning 11683 1726853266.30617: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-c5b2-e075-0000000003fb] 11683 1726853266.30621: sending task result for task 02083763-bbaf-c5b2-e075-0000000003fb 11683 1726853266.30715: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003fb 11683 1726853266.30718: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020896", "end": "2024-09-20 13:27:46.261638", "rc": 0, "start": "2024-09-20 13:27:46.240742" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11683 1726853266.30791: no more pending results, returning what we have 11683 1726853266.30795: results queue empty 11683 1726853266.30796: checking for any_errors_fatal 11683 1726853266.30800: done checking for any_errors_fatal 11683 1726853266.30800: checking for max_fail_percentage 11683 1726853266.30802: done checking for max_fail_percentage 11683 1726853266.30803: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.30804: done checking to see if all hosts have failed 11683 1726853266.30804: getting the remaining hosts for this loop 11683 1726853266.30806: done getting the remaining hosts for this loop 11683 1726853266.30809: getting the next task for host managed_node3 11683 1726853266.30815: done getting next task for host managed_node3 11683 1726853266.30817: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11683 1726853266.30822: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853266.30825: getting variables 11683 1726853266.30833: in VariableManager get_vars() 11683 1726853266.30877: Calling all_inventory to load vars for managed_node3 11683 1726853266.30880: Calling groups_inventory to load vars for managed_node3 11683 1726853266.30882: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.30892: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.30894: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.30897: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.32181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.33754: done with get_vars() 11683 1726853266.33779: done getting variables 11683 1726853266.33839: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:46 -0400 (0:00:00.439) 0:00:19.410 ****** 11683 1726853266.33874: entering _queue_task() for managed_node3/set_fact 11683 1726853266.34222: worker is 1 (out of 1 available) 11683 1726853266.34235: exiting _queue_task() for managed_node3/set_fact 11683 1726853266.34249: done queuing things up, now waiting for results queue to drain 11683 1726853266.34251: waiting for pending results... 11683 1726853266.34693: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11683 1726853266.34699: in run() - task 02083763-bbaf-c5b2-e075-0000000003fc 11683 1726853266.34709: variable 'ansible_search_path' from source: unknown 11683 1726853266.34716: variable 'ansible_search_path' from source: unknown 11683 1726853266.34758: calling self._execute() 11683 1726853266.34864: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.34878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.34900: variable 'omit' from source: magic vars 11683 1726853266.35305: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.35473: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.35478: variable 'nm_profile_exists' from source: set_fact 11683 1726853266.35491: Evaluated conditional (nm_profile_exists.rc == 0): True 11683 1726853266.35501: variable 'omit' from source: magic vars 11683 1726853266.35557: variable 'omit' from source: magic vars 11683 1726853266.35602: variable 'omit' from source: magic vars 11683 1726853266.35651: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853266.35699: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 11683 1726853266.35730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853266.35756: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.35815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.35818: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853266.35822: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.35828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.35937: Set connection var ansible_shell_executable to /bin/sh 11683 1726853266.35957: Set connection var ansible_timeout to 10 11683 1726853266.35970: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853266.35984: Set connection var ansible_pipelining to False 11683 1726853266.36032: Set connection var ansible_shell_type to sh 11683 1726853266.36035: Set connection var ansible_connection to ssh 11683 1726853266.36038: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.36040: variable 'ansible_connection' from source: unknown 11683 1726853266.36045: variable 'ansible_module_compression' from source: unknown 11683 1726853266.36047: variable 'ansible_shell_type' from source: unknown 11683 1726853266.36049: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.36051: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.36059: variable 'ansible_pipelining' from source: unknown 11683 1726853266.36068: variable 'ansible_timeout' from source: unknown 11683 1726853266.36078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.36233: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853266.36362: variable 'omit' from source: magic vars 11683 1726853266.36365: starting attempt loop 11683 1726853266.36368: running the handler 11683 1726853266.36372: handler run complete 11683 1726853266.36375: attempt loop complete, returning result 11683 1726853266.36377: _execute() done 11683 1726853266.36379: dumping result to json 11683 1726853266.36381: done dumping result, returning 11683 1726853266.36383: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-c5b2-e075-0000000003fc] 11683 1726853266.36385: sending task result for task 02083763-bbaf-c5b2-e075-0000000003fc 11683 1726853266.36456: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003fc 11683 1726853266.36459: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11683 1726853266.36547: no more pending results, returning what we have 11683 1726853266.36551: results queue empty 11683 1726853266.36552: checking for any_errors_fatal 11683 1726853266.36560: done checking for any_errors_fatal 11683 1726853266.36561: checking for max_fail_percentage 11683 1726853266.36563: done checking for max_fail_percentage 11683 1726853266.36564: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.36565: done checking to see if all hosts have failed 11683 1726853266.36566: getting the remaining hosts for this loop 11683 1726853266.36568: done getting the remaining hosts for this loop 11683 1726853266.36573: getting the next task for host 
managed_node3 11683 1726853266.36584: done getting next task for host managed_node3 11683 1726853266.36588: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11683 1726853266.36593: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853266.36598: getting variables 11683 1726853266.36600: in VariableManager get_vars() 11683 1726853266.36647: Calling all_inventory to load vars for managed_node3 11683 1726853266.36650: Calling groups_inventory to load vars for managed_node3 11683 1726853266.36653: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.36665: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.36669: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.36878: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.38303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.41747: done with get_vars() 11683 1726853266.41779: done getting variables 11683 1726853266.42112: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.42250: variable 'profile' from source: include params 11683 1726853266.42254: variable 'item' from source: include params 11683 1726853266.42436: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:46 -0400 (0:00:00.085) 0:00:19.496 ****** 11683 1726853266.42509: entering _queue_task() for managed_node3/command 11683 1726853266.43319: worker is 1 (out of 1 available) 11683 1726853266.43331: exiting _queue_task() for managed_node3/command 11683 1726853266.43342: done queuing things up, now waiting for results queue to drain 11683 1726853266.43343: waiting for pending results... 
11683 1726853266.43859: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11683 1726853266.44278: in run() - task 02083763-bbaf-c5b2-e075-0000000003fe 11683 1726853266.44282: variable 'ansible_search_path' from source: unknown 11683 1726853266.44285: variable 'ansible_search_path' from source: unknown 11683 1726853266.44288: calling self._execute() 11683 1726853266.44290: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.44387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.44402: variable 'omit' from source: magic vars 11683 1726853266.45778: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.45783: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.45787: variable 'profile_stat' from source: set_fact 11683 1726853266.45789: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853266.45792: when evaluation is False, skipping this task 11683 1726853266.45796: _execute() done 11683 1726853266.45799: dumping result to json 11683 1726853266.45801: done dumping result, returning 11683 1726853266.45806: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-c5b2-e075-0000000003fe] 11683 1726853266.46479: sending task result for task 02083763-bbaf-c5b2-e075-0000000003fe 11683 1726853266.46558: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003fe 11683 1726853266.46563: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853266.46622: no more pending results, returning what we have 11683 1726853266.46626: results queue empty 11683 1726853266.46627: checking for any_errors_fatal 11683 1726853266.46633: done checking for any_errors_fatal 11683 1726853266.46633: 
checking for max_fail_percentage 11683 1726853266.46635: done checking for max_fail_percentage 11683 1726853266.46636: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.46637: done checking to see if all hosts have failed 11683 1726853266.46638: getting the remaining hosts for this loop 11683 1726853266.46639: done getting the remaining hosts for this loop 11683 1726853266.46642: getting the next task for host managed_node3 11683 1726853266.46650: done getting next task for host managed_node3 11683 1726853266.46653: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11683 1726853266.46658: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853266.46662: getting variables 11683 1726853266.46664: in VariableManager get_vars() 11683 1726853266.46710: Calling all_inventory to load vars for managed_node3 11683 1726853266.46713: Calling groups_inventory to load vars for managed_node3 11683 1726853266.46715: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.46728: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.46731: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.46733: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.50051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.53581: done with get_vars() 11683 1726853266.53727: done getting variables 11683 1726853266.53816: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.53978: variable 'profile' from source: include params 11683 1726853266.53982: variable 'item' from source: include params 11683 1726853266.54049: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:46 -0400 (0:00:00.115) 0:00:19.612 ****** 11683 1726853266.54084: entering _queue_task() for managed_node3/set_fact 11683 1726853266.54463: worker is 1 (out of 1 available) 11683 1726853266.54477: exiting _queue_task() for managed_node3/set_fact 11683 1726853266.54494: done queuing things up, now waiting for results queue to drain 11683 1726853266.54496: waiting for pending results... 
11683 1726853266.54768: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11683 1726853266.54901: in run() - task 02083763-bbaf-c5b2-e075-0000000003ff 11683 1726853266.54927: variable 'ansible_search_path' from source: unknown 11683 1726853266.54936: variable 'ansible_search_path' from source: unknown 11683 1726853266.54978: calling self._execute() 11683 1726853266.55083: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.55094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.55114: variable 'omit' from source: magic vars 11683 1726853266.55511: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.55527: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.55668: variable 'profile_stat' from source: set_fact 11683 1726853266.55701: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853266.55726: when evaluation is False, skipping this task 11683 1726853266.55734: _execute() done 11683 1726853266.55741: dumping result to json 11683 1726853266.55749: done dumping result, returning 11683 1726853266.55760: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [02083763-bbaf-c5b2-e075-0000000003ff] 11683 1726853266.55769: sending task result for task 02083763-bbaf-c5b2-e075-0000000003ff skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853266.55921: no more pending results, returning what we have 11683 1726853266.55925: results queue empty 11683 1726853266.55926: checking for any_errors_fatal 11683 1726853266.55932: done checking for any_errors_fatal 11683 1726853266.55933: checking for max_fail_percentage 11683 1726853266.55935: done checking for max_fail_percentage 11683 1726853266.55935: checking to see if all 
hosts have failed and the running result is not ok 11683 1726853266.55936: done checking to see if all hosts have failed 11683 1726853266.55937: getting the remaining hosts for this loop 11683 1726853266.55938: done getting the remaining hosts for this loop 11683 1726853266.55941: getting the next task for host managed_node3 11683 1726853266.55950: done getting next task for host managed_node3 11683 1726853266.55952: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11683 1726853266.55957: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853266.55962: getting variables 11683 1726853266.55964: in VariableManager get_vars() 11683 1726853266.56008: Calling all_inventory to load vars for managed_node3 11683 1726853266.56010: Calling groups_inventory to load vars for managed_node3 11683 1726853266.56012: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.56019: done sending task result for task 02083763-bbaf-c5b2-e075-0000000003ff 11683 1726853266.56022: WORKER PROCESS EXITING 11683 1726853266.56137: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.56140: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.56146: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.57995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.60383: done with get_vars() 11683 1726853266.60406: done getting variables 11683 1726853266.60591: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.60815: variable 'profile' from source: include params 11683 1726853266.60819: variable 'item' from source: include params 11683 1726853266.60982: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:46 -0400 (0:00:00.069) 0:00:19.682 ****** 11683 1726853266.61021: entering _queue_task() for managed_node3/command 11683 1726853266.61723: worker is 1 (out of 1 available) 11683 1726853266.61737: exiting _queue_task() for managed_node3/command 11683 
1726853266.61751: done queuing things up, now waiting for results queue to drain 11683 1726853266.61756: waiting for pending results... 11683 1726853266.61998: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 11683 1726853266.62113: in run() - task 02083763-bbaf-c5b2-e075-000000000400 11683 1726853266.62128: variable 'ansible_search_path' from source: unknown 11683 1726853266.62132: variable 'ansible_search_path' from source: unknown 11683 1726853266.62168: calling self._execute() 11683 1726853266.62264: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.62302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.62306: variable 'omit' from source: magic vars 11683 1726853266.62682: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.62694: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.62813: variable 'profile_stat' from source: set_fact 11683 1726853266.62850: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853266.62854: when evaluation is False, skipping this task 11683 1726853266.62857: _execute() done 11683 1726853266.62859: dumping result to json 11683 1726853266.62861: done dumping result, returning 11683 1726853266.62864: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-c5b2-e075-000000000400] 11683 1726853266.62866: sending task result for task 02083763-bbaf-c5b2-e075-000000000400 11683 1726853266.63011: done sending task result for task 02083763-bbaf-c5b2-e075-000000000400 11683 1726853266.63013: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853266.63110: no more pending results, returning what we have 11683 1726853266.63114: results queue empty 11683 
1726853266.63115: checking for any_errors_fatal 11683 1726853266.63123: done checking for any_errors_fatal 11683 1726853266.63124: checking for max_fail_percentage 11683 1726853266.63127: done checking for max_fail_percentage 11683 1726853266.63128: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.63129: done checking to see if all hosts have failed 11683 1726853266.63130: getting the remaining hosts for this loop 11683 1726853266.63131: done getting the remaining hosts for this loop 11683 1726853266.63135: getting the next task for host managed_node3 11683 1726853266.63145: done getting next task for host managed_node3 11683 1726853266.63148: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11683 1726853266.63153: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853266.63159: getting variables 11683 1726853266.63160: in VariableManager get_vars() 11683 1726853266.63315: Calling all_inventory to load vars for managed_node3 11683 1726853266.63318: Calling groups_inventory to load vars for managed_node3 11683 1726853266.63320: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.63330: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.63333: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.63335: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.64807: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.66277: done with get_vars() 11683 1726853266.66295: done getting variables 11683 1726853266.66339: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.66426: variable 'profile' from source: include params 11683 1726853266.66429: variable 'item' from source: include params 11683 1726853266.66470: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:46 -0400 (0:00:00.054) 0:00:19.736 ****** 11683 1726853266.66496: entering _queue_task() for managed_node3/set_fact 11683 1726853266.66793: worker is 1 (out of 1 available) 11683 1726853266.66805: exiting _queue_task() for managed_node3/set_fact 11683 1726853266.66816: done queuing things up, now waiting for results queue to drain 11683 1726853266.66817: waiting for pending results... 
11683 1726853266.67078: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11683 1726853266.67280: in run() - task 02083763-bbaf-c5b2-e075-000000000401 11683 1726853266.67284: variable 'ansible_search_path' from source: unknown 11683 1726853266.67288: variable 'ansible_search_path' from source: unknown 11683 1726853266.67291: calling self._execute() 11683 1726853266.67294: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.67296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.67300: variable 'omit' from source: magic vars 11683 1726853266.67663: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.67676: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.67940: variable 'profile_stat' from source: set_fact 11683 1726853266.67975: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853266.67985: when evaluation is False, skipping this task 11683 1726853266.67992: _execute() done 11683 1726853266.67999: dumping result to json 11683 1726853266.68006: done dumping result, returning 11683 1726853266.68016: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [02083763-bbaf-c5b2-e075-000000000401] 11683 1726853266.68032: sending task result for task 02083763-bbaf-c5b2-e075-000000000401 11683 1726853266.68245: done sending task result for task 02083763-bbaf-c5b2-e075-000000000401 11683 1726853266.68249: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853266.68337: no more pending results, returning what we have 11683 1726853266.68342: results queue empty 11683 1726853266.68343: checking for any_errors_fatal 11683 1726853266.68349: done checking for any_errors_fatal 11683 1726853266.68350: checking 
for max_fail_percentage 11683 1726853266.68351: done checking for max_fail_percentage 11683 1726853266.68352: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.68353: done checking to see if all hosts have failed 11683 1726853266.68354: getting the remaining hosts for this loop 11683 1726853266.68355: done getting the remaining hosts for this loop 11683 1726853266.68389: getting the next task for host managed_node3 11683 1726853266.68399: done getting next task for host managed_node3 11683 1726853266.68402: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11683 1726853266.68405: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853266.68410: getting variables 11683 1726853266.68412: in VariableManager get_vars() 11683 1726853266.68455: Calling all_inventory to load vars for managed_node3 11683 1726853266.68458: Calling groups_inventory to load vars for managed_node3 11683 1726853266.68460: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.68588: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.68592: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.68596: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.70151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.71011: done with get_vars() 11683 1726853266.71032: done getting variables 11683 1726853266.71081: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.71172: variable 'profile' from source: include params 11683 1726853266.71176: variable 'item' from source: include params 11683 1726853266.71216: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:46 -0400 (0:00:00.047) 0:00:19.784 ****** 11683 1726853266.71239: entering _queue_task() for managed_node3/assert 11683 1726853266.71500: worker is 1 (out of 1 available) 11683 1726853266.71515: exiting _queue_task() for managed_node3/assert 11683 1726853266.71527: done queuing things up, now waiting for results queue to drain 11683 1726853266.71528: waiting for pending results... 
11683 1726853266.71824: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' 11683 1726853266.71830: in run() - task 02083763-bbaf-c5b2-e075-000000000267 11683 1726853266.71833: variable 'ansible_search_path' from source: unknown 11683 1726853266.71835: variable 'ansible_search_path' from source: unknown 11683 1726853266.71916: calling self._execute() 11683 1726853266.71987: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.71991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.72190: variable 'omit' from source: magic vars 11683 1726853266.72367: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.72381: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.72387: variable 'omit' from source: magic vars 11683 1726853266.72431: variable 'omit' from source: magic vars 11683 1726853266.72526: variable 'profile' from source: include params 11683 1726853266.72530: variable 'item' from source: include params 11683 1726853266.72603: variable 'item' from source: include params 11683 1726853266.72630: variable 'omit' from source: magic vars 11683 1726853266.72675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853266.72726: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853266.72781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853266.72786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.72789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.72848: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853266.72852: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.72855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.72945: Set connection var ansible_shell_executable to /bin/sh 11683 1726853266.72959: Set connection var ansible_timeout to 10 11683 1726853266.72965: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853266.72970: Set connection var ansible_pipelining to False 11683 1726853266.72975: Set connection var ansible_shell_type to sh 11683 1726853266.72978: Set connection var ansible_connection to ssh 11683 1726853266.72995: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.72998: variable 'ansible_connection' from source: unknown 11683 1726853266.73000: variable 'ansible_module_compression' from source: unknown 11683 1726853266.73003: variable 'ansible_shell_type' from source: unknown 11683 1726853266.73005: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.73008: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.73010: variable 'ansible_pipelining' from source: unknown 11683 1726853266.73013: variable 'ansible_timeout' from source: unknown 11683 1726853266.73017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.73123: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853266.73132: variable 'omit' from source: magic vars 11683 1726853266.73137: starting attempt loop 11683 1726853266.73140: running the handler 11683 1726853266.73226: variable 'lsr_net_profile_exists' from source: set_fact 11683 1726853266.73229: Evaluated conditional 
(lsr_net_profile_exists): True 11683 1726853266.73235: handler run complete 11683 1726853266.73249: attempt loop complete, returning result 11683 1726853266.73252: _execute() done 11683 1726853266.73255: dumping result to json 11683 1726853266.73257: done dumping result, returning 11683 1726853266.73263: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.0' [02083763-bbaf-c5b2-e075-000000000267] 11683 1726853266.73268: sending task result for task 02083763-bbaf-c5b2-e075-000000000267 11683 1726853266.73349: done sending task result for task 02083763-bbaf-c5b2-e075-000000000267 11683 1726853266.73352: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853266.73426: no more pending results, returning what we have 11683 1726853266.73429: results queue empty 11683 1726853266.73430: checking for any_errors_fatal 11683 1726853266.73436: done checking for any_errors_fatal 11683 1726853266.73437: checking for max_fail_percentage 11683 1726853266.73439: done checking for max_fail_percentage 11683 1726853266.73440: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.73441: done checking to see if all hosts have failed 11683 1726853266.73442: getting the remaining hosts for this loop 11683 1726853266.73443: done getting the remaining hosts for this loop 11683 1726853266.73446: getting the next task for host managed_node3 11683 1726853266.73452: done getting next task for host managed_node3 11683 1726853266.73455: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11683 1726853266.73457: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853266.73462: getting variables 11683 1726853266.73463: in VariableManager get_vars() 11683 1726853266.73509: Calling all_inventory to load vars for managed_node3 11683 1726853266.73511: Calling groups_inventory to load vars for managed_node3 11683 1726853266.73514: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.73524: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.73526: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.73528: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.74313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.75176: done with get_vars() 11683 1726853266.75192: done getting variables 11683 1726853266.75236: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.75322: variable 'profile' from source: include params 11683 1726853266.75325: variable 'item' from source: include params 11683 1726853266.75365: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:46 -0400 
(0:00:00.041) 0:00:19.825 ****** 11683 1726853266.75395: entering _queue_task() for managed_node3/assert 11683 1726853266.75641: worker is 1 (out of 1 available) 11683 1726853266.75654: exiting _queue_task() for managed_node3/assert 11683 1726853266.75666: done queuing things up, now waiting for results queue to drain 11683 1726853266.75667: waiting for pending results... 11683 1726853266.75852: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11683 1726853266.75924: in run() - task 02083763-bbaf-c5b2-e075-000000000268 11683 1726853266.75937: variable 'ansible_search_path' from source: unknown 11683 1726853266.75940: variable 'ansible_search_path' from source: unknown 11683 1726853266.75969: calling self._execute() 11683 1726853266.76048: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.76054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.76063: variable 'omit' from source: magic vars 11683 1726853266.76343: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.76355: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.76361: variable 'omit' from source: magic vars 11683 1726853266.76389: variable 'omit' from source: magic vars 11683 1726853266.76460: variable 'profile' from source: include params 11683 1726853266.76464: variable 'item' from source: include params 11683 1726853266.76510: variable 'item' from source: include params 11683 1726853266.76524: variable 'omit' from source: magic vars 11683 1726853266.76560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853266.76589: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853266.76606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 
1726853266.76620: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.76629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.76659: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853266.76662: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.76665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.76732: Set connection var ansible_shell_executable to /bin/sh 11683 1726853266.76740: Set connection var ansible_timeout to 10 11683 1726853266.76749: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853266.76754: Set connection var ansible_pipelining to False 11683 1726853266.76757: Set connection var ansible_shell_type to sh 11683 1726853266.76761: Set connection var ansible_connection to ssh 11683 1726853266.76780: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.76783: variable 'ansible_connection' from source: unknown 11683 1726853266.76785: variable 'ansible_module_compression' from source: unknown 11683 1726853266.76788: variable 'ansible_shell_type' from source: unknown 11683 1726853266.76790: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.76792: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.76796: variable 'ansible_pipelining' from source: unknown 11683 1726853266.76799: variable 'ansible_timeout' from source: unknown 11683 1726853266.76802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.76907: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853266.76916: variable 'omit' from source: magic vars 11683 1726853266.76921: starting attempt loop 11683 1726853266.76924: running the handler 11683 1726853266.77002: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11683 1726853266.77006: Evaluated conditional (lsr_net_profile_ansible_managed): True 11683 1726853266.77012: handler run complete 11683 1726853266.77023: attempt loop complete, returning result 11683 1726853266.77026: _execute() done 11683 1726853266.77028: dumping result to json 11683 1726853266.77031: done dumping result, returning 11683 1726853266.77038: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [02083763-bbaf-c5b2-e075-000000000268] 11683 1726853266.77043: sending task result for task 02083763-bbaf-c5b2-e075-000000000268 11683 1726853266.77125: done sending task result for task 02083763-bbaf-c5b2-e075-000000000268 11683 1726853266.77127: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853266.77177: no more pending results, returning what we have 11683 1726853266.77181: results queue empty 11683 1726853266.77182: checking for any_errors_fatal 11683 1726853266.77188: done checking for any_errors_fatal 11683 1726853266.77188: checking for max_fail_percentage 11683 1726853266.77190: done checking for max_fail_percentage 11683 1726853266.77191: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.77193: done checking to see if all hosts have failed 11683 1726853266.77193: getting the remaining hosts for this loop 11683 1726853266.77195: done getting the remaining hosts for this loop 11683 1726853266.77197: getting the next task for host managed_node3 11683 1726853266.77203: done getting 
next task for host managed_node3 11683 1726853266.77206: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11683 1726853266.77208: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853266.77213: getting variables 11683 1726853266.77214: in VariableManager get_vars() 11683 1726853266.77256: Calling all_inventory to load vars for managed_node3 11683 1726853266.77259: Calling groups_inventory to load vars for managed_node3 11683 1726853266.77261: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.77280: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.77284: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.77287: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.78188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.79034: done with get_vars() 11683 1726853266.79049: done getting variables 11683 1726853266.79091: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853266.79168: variable 'profile' from source: include params 11683 1726853266.79173: variable 'item' from 
source: include params 11683 1726853266.79212: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:46 -0400 (0:00:00.038) 0:00:19.864 ****** 11683 1726853266.79239: entering _queue_task() for managed_node3/assert 11683 1726853266.79466: worker is 1 (out of 1 available) 11683 1726853266.79483: exiting _queue_task() for managed_node3/assert 11683 1726853266.79494: done queuing things up, now waiting for results queue to drain 11683 1726853266.79495: waiting for pending results... 11683 1726853266.79674: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 11683 1726853266.79755: in run() - task 02083763-bbaf-c5b2-e075-000000000269 11683 1726853266.79766: variable 'ansible_search_path' from source: unknown 11683 1726853266.79770: variable 'ansible_search_path' from source: unknown 11683 1726853266.79800: calling self._execute() 11683 1726853266.79876: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.79880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.79888: variable 'omit' from source: magic vars 11683 1726853266.80153: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.80164: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.80167: variable 'omit' from source: magic vars 11683 1726853266.80196: variable 'omit' from source: magic vars 11683 1726853266.80267: variable 'profile' from source: include params 11683 1726853266.80277: variable 'item' from source: include params 11683 1726853266.80317: variable 'item' from source: include params 11683 1726853266.80330: variable 'omit' from source: magic vars 11683 1726853266.80364: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853266.80395: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853266.80412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853266.80425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.80435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.80461: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853266.80464: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.80467: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.80537: Set connection var ansible_shell_executable to /bin/sh 11683 1726853266.80548: Set connection var ansible_timeout to 10 11683 1726853266.80555: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853266.80559: Set connection var ansible_pipelining to False 11683 1726853266.80562: Set connection var ansible_shell_type to sh 11683 1726853266.80564: Set connection var ansible_connection to ssh 11683 1726853266.80582: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.80585: variable 'ansible_connection' from source: unknown 11683 1726853266.80587: variable 'ansible_module_compression' from source: unknown 11683 1726853266.80590: variable 'ansible_shell_type' from source: unknown 11683 1726853266.80592: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.80594: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.80606: variable 'ansible_pipelining' from source: unknown 11683 1726853266.80610: variable 'ansible_timeout' 
from source: unknown 11683 1726853266.80612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.80716: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853266.80720: variable 'omit' from source: magic vars 11683 1726853266.80724: starting attempt loop 11683 1726853266.80727: running the handler 11683 1726853266.80801: variable 'lsr_net_profile_fingerprint' from source: set_fact 11683 1726853266.80804: Evaluated conditional (lsr_net_profile_fingerprint): True 11683 1726853266.80811: handler run complete 11683 1726853266.80823: attempt loop complete, returning result 11683 1726853266.80826: _execute() done 11683 1726853266.80828: dumping result to json 11683 1726853266.80831: done dumping result, returning 11683 1726853266.80840: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.0 [02083763-bbaf-c5b2-e075-000000000269] 11683 1726853266.80843: sending task result for task 02083763-bbaf-c5b2-e075-000000000269 11683 1726853266.80920: done sending task result for task 02083763-bbaf-c5b2-e075-000000000269 11683 1726853266.80927: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853266.80989: no more pending results, returning what we have 11683 1726853266.80993: results queue empty 11683 1726853266.80993: checking for any_errors_fatal 11683 1726853266.80999: done checking for any_errors_fatal 11683 1726853266.81000: checking for max_fail_percentage 11683 1726853266.81002: done checking for max_fail_percentage 11683 1726853266.81003: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.81004: done checking to see if all 
hosts have failed 11683 1726853266.81005: getting the remaining hosts for this loop 11683 1726853266.81006: done getting the remaining hosts for this loop 11683 1726853266.81009: getting the next task for host managed_node3 11683 1726853266.81016: done getting next task for host managed_node3 11683 1726853266.81018: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11683 1726853266.81021: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853266.81025: getting variables 11683 1726853266.81026: in VariableManager get_vars() 11683 1726853266.81065: Calling all_inventory to load vars for managed_node3 11683 1726853266.81067: Calling groups_inventory to load vars for managed_node3 11683 1726853266.81069: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.81080: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.81082: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.81085: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.81833: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.82690: done with get_vars() 11683 1726853266.82705: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 13:27:46 -0400 (0:00:00.035) 0:00:19.899 ****** 11683 1726853266.82769: entering _queue_task() for managed_node3/include_tasks 11683 1726853266.83000: worker is 1 (out of 1 available) 11683 1726853266.83014: exiting _queue_task() for managed_node3/include_tasks 11683 1726853266.83026: done queuing things up, now waiting for results queue to drain 11683 1726853266.83028: waiting for pending results... 11683 1726853266.83208: running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' 11683 1726853266.83292: in run() - task 02083763-bbaf-c5b2-e075-00000000026d 11683 1726853266.83302: variable 'ansible_search_path' from source: unknown 11683 1726853266.83306: variable 'ansible_search_path' from source: unknown 11683 1726853266.83332: calling self._execute() 11683 1726853266.83407: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.83411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.83421: variable 'omit' from source: magic vars 11683 1726853266.83690: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.83701: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.83707: _execute() done 11683 1726853266.83710: dumping result to json 11683 1726853266.83712: done dumping result, returning 11683 1726853266.83718: done running TaskExecutor() for managed_node3/TASK: Include the task 'get_profile_stat.yml' [02083763-bbaf-c5b2-e075-00000000026d] 11683 1726853266.83723: sending task result for task 02083763-bbaf-c5b2-e075-00000000026d 11683 1726853266.83805: done sending task result for task 02083763-bbaf-c5b2-e075-00000000026d 11683 1726853266.83808: WORKER PROCESS EXITING 11683 1726853266.83834: no more pending results, returning what we have 11683 
1726853266.83839: in VariableManager get_vars() 11683 1726853266.83886: Calling all_inventory to load vars for managed_node3 11683 1726853266.83889: Calling groups_inventory to load vars for managed_node3 11683 1726853266.83891: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.83904: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.83906: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.83909: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.84752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.85594: done with get_vars() 11683 1726853266.85609: variable 'ansible_search_path' from source: unknown 11683 1726853266.85610: variable 'ansible_search_path' from source: unknown 11683 1726853266.85633: we have included files to process 11683 1726853266.85634: generating all_blocks data 11683 1726853266.85635: done generating all_blocks data 11683 1726853266.85638: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853266.85639: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853266.85640: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11683 1726853266.86215: done processing included file 11683 1726853266.86217: iterating over new_blocks loaded from include file 11683 1726853266.86218: in VariableManager get_vars() 11683 1726853266.86232: done with get_vars() 11683 1726853266.86233: filtering new block on tags 11683 1726853266.86248: done filtering new block on tags 11683 1726853266.86250: in VariableManager get_vars() 11683 1726853266.86263: done with get_vars() 11683 1726853266.86265: filtering 
new block on tags 11683 1726853266.86279: done filtering new block on tags 11683 1726853266.86280: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node3 11683 1726853266.86284: extending task lists for all hosts with included blocks 11683 1726853266.86384: done extending task lists 11683 1726853266.86385: done processing included files 11683 1726853266.86386: results queue empty 11683 1726853266.86386: checking for any_errors_fatal 11683 1726853266.86388: done checking for any_errors_fatal 11683 1726853266.86389: checking for max_fail_percentage 11683 1726853266.86389: done checking for max_fail_percentage 11683 1726853266.86390: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.86390: done checking to see if all hosts have failed 11683 1726853266.86391: getting the remaining hosts for this loop 11683 1726853266.86391: done getting the remaining hosts for this loop 11683 1726853266.86393: getting the next task for host managed_node3 11683 1726853266.86396: done getting next task for host managed_node3 11683 1726853266.86397: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11683 1726853266.86399: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853266.86401: getting variables 11683 1726853266.86401: in VariableManager get_vars() 11683 1726853266.86410: Calling all_inventory to load vars for managed_node3 11683 1726853266.86411: Calling groups_inventory to load vars for managed_node3 11683 1726853266.86413: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.86416: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.86418: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.86419: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.87086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.87922: done with get_vars() 11683 1726853266.87936: done getting variables 11683 1726853266.87963: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 13:27:46 -0400 (0:00:00.052) 0:00:19.951 ****** 11683 1726853266.87986: entering _queue_task() for managed_node3/set_fact 11683 1726853266.88229: worker is 1 (out of 1 available) 11683 1726853266.88240: exiting _queue_task() for managed_node3/set_fact 11683 1726853266.88252: done queuing things up, now waiting for results queue to drain 11683 1726853266.88254: waiting for pending results... 
11683 1726853266.88433: running TaskExecutor() for managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag 11683 1726853266.88516: in run() - task 02083763-bbaf-c5b2-e075-000000000440 11683 1726853266.88528: variable 'ansible_search_path' from source: unknown 11683 1726853266.88532: variable 'ansible_search_path' from source: unknown 11683 1726853266.88589: calling self._execute() 11683 1726853266.88657: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.88661: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.88670: variable 'omit' from source: magic vars 11683 1726853266.88956: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.88967: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.88970: variable 'omit' from source: magic vars 11683 1726853266.89004: variable 'omit' from source: magic vars 11683 1726853266.89032: variable 'omit' from source: magic vars 11683 1726853266.89065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853266.89095: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853266.89110: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853266.89125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.89139: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.89160: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853266.89164: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.89166: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11683 1726853266.89232: Set connection var ansible_shell_executable to /bin/sh 11683 1726853266.89249: Set connection var ansible_timeout to 10 11683 1726853266.89252: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853266.89255: Set connection var ansible_pipelining to False 11683 1726853266.89257: Set connection var ansible_shell_type to sh 11683 1726853266.89259: Set connection var ansible_connection to ssh 11683 1726853266.89276: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.89279: variable 'ansible_connection' from source: unknown 11683 1726853266.89282: variable 'ansible_module_compression' from source: unknown 11683 1726853266.89284: variable 'ansible_shell_type' from source: unknown 11683 1726853266.89286: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.89290: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.89294: variable 'ansible_pipelining' from source: unknown 11683 1726853266.89296: variable 'ansible_timeout' from source: unknown 11683 1726853266.89300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.89402: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853266.89411: variable 'omit' from source: magic vars 11683 1726853266.89416: starting attempt loop 11683 1726853266.89419: running the handler 11683 1726853266.89430: handler run complete 11683 1726853266.89438: attempt loop complete, returning result 11683 1726853266.89441: _execute() done 11683 1726853266.89446: dumping result to json 11683 1726853266.89449: done dumping result, returning 11683 1726853266.89455: done running TaskExecutor() for 
managed_node3/TASK: Initialize NM profile exist and ansible_managed comment flag [02083763-bbaf-c5b2-e075-000000000440] 11683 1726853266.89458: sending task result for task 02083763-bbaf-c5b2-e075-000000000440 11683 1726853266.89533: done sending task result for task 02083763-bbaf-c5b2-e075-000000000440 11683 1726853266.89536: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11683 1726853266.89623: no more pending results, returning what we have 11683 1726853266.89626: results queue empty 11683 1726853266.89626: checking for any_errors_fatal 11683 1726853266.89628: done checking for any_errors_fatal 11683 1726853266.89628: checking for max_fail_percentage 11683 1726853266.89630: done checking for max_fail_percentage 11683 1726853266.89631: checking to see if all hosts have failed and the running result is not ok 11683 1726853266.89632: done checking to see if all hosts have failed 11683 1726853266.89633: getting the remaining hosts for this loop 11683 1726853266.89634: done getting the remaining hosts for this loop 11683 1726853266.89637: getting the next task for host managed_node3 11683 1726853266.89646: done getting next task for host managed_node3 11683 1726853266.89648: ^ task is: TASK: Stat profile file 11683 1726853266.89652: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853266.89655: getting variables 11683 1726853266.89656: in VariableManager get_vars() 11683 1726853266.89692: Calling all_inventory to load vars for managed_node3 11683 1726853266.89694: Calling groups_inventory to load vars for managed_node3 11683 1726853266.89696: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853266.89705: Calling all_plugins_play to load vars for managed_node3 11683 1726853266.89707: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853266.89709: Calling groups_plugins_play to load vars for managed_node3 11683 1726853266.90741: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853266.91873: done with get_vars() 11683 1726853266.91894: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 13:27:46 -0400 (0:00:00.039) 0:00:19.991 ****** 11683 1726853266.91960: entering _queue_task() for managed_node3/stat 11683 1726853266.92209: worker is 1 (out of 1 available) 11683 1726853266.92223: exiting _queue_task() for managed_node3/stat 11683 1726853266.92236: done queuing things up, now waiting for results queue to drain 11683 1726853266.92237: waiting for pending results... 
11683 1726853266.92416: running TaskExecutor() for managed_node3/TASK: Stat profile file 11683 1726853266.92497: in run() - task 02083763-bbaf-c5b2-e075-000000000441 11683 1726853266.92508: variable 'ansible_search_path' from source: unknown 11683 1726853266.92512: variable 'ansible_search_path' from source: unknown 11683 1726853266.92539: calling self._execute() 11683 1726853266.92614: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.92618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.92628: variable 'omit' from source: magic vars 11683 1726853266.92902: variable 'ansible_distribution_major_version' from source: facts 11683 1726853266.92913: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853266.92918: variable 'omit' from source: magic vars 11683 1726853266.92952: variable 'omit' from source: magic vars 11683 1726853266.93078: variable 'profile' from source: include params 11683 1726853266.93081: variable 'item' from source: include params 11683 1726853266.93107: variable 'item' from source: include params 11683 1726853266.93126: variable 'omit' from source: magic vars 11683 1726853266.93168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853266.93246: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853266.93356: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853266.93359: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.93362: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853266.93364: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853266.93366: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.93368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.93504: Set connection var ansible_shell_executable to /bin/sh 11683 1726853266.93514: Set connection var ansible_timeout to 10 11683 1726853266.93521: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853266.93526: Set connection var ansible_pipelining to False 11683 1726853266.93528: Set connection var ansible_shell_type to sh 11683 1726853266.93531: Set connection var ansible_connection to ssh 11683 1726853266.93595: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.93598: variable 'ansible_connection' from source: unknown 11683 1726853266.93601: variable 'ansible_module_compression' from source: unknown 11683 1726853266.93603: variable 'ansible_shell_type' from source: unknown 11683 1726853266.93605: variable 'ansible_shell_executable' from source: unknown 11683 1726853266.93606: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853266.93608: variable 'ansible_pipelining' from source: unknown 11683 1726853266.93611: variable 'ansible_timeout' from source: unknown 11683 1726853266.93613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853266.93776: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853266.93780: variable 'omit' from source: magic vars 11683 1726853266.93783: starting attempt loop 11683 1726853266.93785: running the handler 11683 1726853266.93787: _low_level_execute_command(): starting 11683 1726853266.93814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853266.94578: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853266.94592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853266.94699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853266.96445: stdout chunk (state=3): >>>/root <<< 11683 1726853266.96577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853266.96580: stdout chunk (state=3): >>><<< 11683 1726853266.96584: stderr chunk (state=3): >>><<< 11683 1726853266.96586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853266.96590: _low_level_execute_command(): starting 11683 1726853266.96601: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109 `" && echo ansible-tmp-1726853266.96578-12522-174875167039109="` echo /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109 `" ) && sleep 0' 11683 1726853266.97194: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853266.97241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853266.97244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853266.97247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853266.97249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853266.97251: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853266.97261: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853266.97276: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853266.97280: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853266.97304: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11683 1726853266.97307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853266.97309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853266.97312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853266.97328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853266.97331: stderr chunk (state=3): >>>debug2: match found <<< 11683 1726853266.97333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853266.97442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853266.97446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853266.97448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853266.97523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853266.99516: stdout chunk (state=3): >>>ansible-tmp-1726853266.96578-12522-174875167039109=/root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109 <<< 11683 1726853266.99624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853266.99655: stderr chunk (state=3): >>><<< 11683 1726853266.99657: stdout chunk (state=3): >>><<< 11683 1726853266.99673: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726853266.96578-12522-174875167039109=/root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853266.99715: variable 'ansible_module_compression' from source: unknown 11683 1726853266.99760: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11683 1726853266.99794: variable 'ansible_facts' from source: unknown 11683 1726853266.99844: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py 11683 1726853266.99953: Sending initial data 11683 1726853266.99957: Sent initial data (151 bytes) 11683 1726853267.00553: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853267.00599: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.00665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.02302: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11683 1726853267.02306: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853267.02358: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853267.02420: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpvirstg9p /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py <<< 11683 1726853267.02423: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py" <<< 11683 1726853267.02476: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpvirstg9p" to remote "/root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py" <<< 11683 1726853267.02480: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py" <<< 11683 1726853267.03136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.03319: stderr chunk (state=3): >>><<< 11683 1726853267.03322: stdout chunk (state=3): >>><<< 11683 1726853267.03325: done transferring module to remote 11683 1726853267.03327: _low_level_execute_command(): starting 11683 1726853267.03329: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/ /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py && sleep 0' 11683 1726853267.03870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853267.03891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.03924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853267.03992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.04035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.04040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.04136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853267.04155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.04212: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.06282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.06287: stdout chunk (state=3): >>><<< 11683 1726853267.06289: stderr chunk (state=3): >>><<< 11683 1726853267.06292: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853267.06295: _low_level_execute_command(): starting 11683 1726853267.06297: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/AnsiballZ_stat.py && sleep 0' 11683 1726853267.06723: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853267.06737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.06800: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853267.06820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.06915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.22640: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11683 1726853267.24049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853267.24080: stderr chunk (state=3): >>><<< 11683 1726853267.24084: stdout chunk (state=3): >>><<< 11683 1726853267.24102: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853267.24127: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853267.24136: _low_level_execute_command(): starting 11683 1726853267.24140: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853266.96578-12522-174875167039109/ > /dev/null 2>&1 && sleep 0' 11683 1726853267.24602: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.24605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853267.24607: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.24609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853267.24619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.24670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853267.24680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853267.24682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.24739: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.26641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.26663: stderr chunk (state=3): >>><<< 11683 1726853267.26666: stdout chunk (state=3): >>><<< 11683 1726853267.26680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853267.26686: handler run complete 11683 1726853267.26701: attempt loop complete, returning result 11683 1726853267.26704: _execute() done 11683 1726853267.26706: dumping result to json 11683 1726853267.26710: done dumping result, returning 11683 1726853267.26721: done running TaskExecutor() for managed_node3/TASK: Stat profile file [02083763-bbaf-c5b2-e075-000000000441] 11683 1726853267.26726: sending task result for task 02083763-bbaf-c5b2-e075-000000000441 11683 1726853267.26820: done sending task result for task 02083763-bbaf-c5b2-e075-000000000441 11683 1726853267.26823: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "stat": { "exists": false } } 11683 1726853267.26880: no more pending results, returning what we have 11683 1726853267.26884: results queue empty 11683 1726853267.26885: checking for any_errors_fatal 11683 1726853267.26892: done checking for any_errors_fatal 11683 1726853267.26892: checking for max_fail_percentage 11683 1726853267.26894: done checking for max_fail_percentage 11683 1726853267.26895: checking to see if all hosts have failed and the running result is not ok 11683 1726853267.26896: done checking to see if all hosts have failed 11683 1726853267.26897: getting the remaining hosts for this loop 11683 1726853267.26898: done getting the remaining hosts for this loop 11683 
1726853267.26902: getting the next task for host managed_node3 11683 1726853267.26908: done getting next task for host managed_node3 11683 1726853267.26910: ^ task is: TASK: Set NM profile exist flag based on the profile files 11683 1726853267.26914: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853267.26918: getting variables 11683 1726853267.26920: in VariableManager get_vars() 11683 1726853267.26961: Calling all_inventory to load vars for managed_node3 11683 1726853267.26964: Calling groups_inventory to load vars for managed_node3 11683 1726853267.26966: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853267.26984: Calling all_plugins_play to load vars for managed_node3 11683 1726853267.26987: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853267.26990: Calling groups_plugins_play to load vars for managed_node3 11683 1726853267.27936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853267.29739: done with get_vars() 11683 1726853267.29763: done getting variables 11683 1726853267.29821: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 13:27:47 -0400 (0:00:00.378) 0:00:20.370 ****** 11683 1726853267.29854: entering _queue_task() for managed_node3/set_fact 11683 1726853267.30152: worker is 1 (out of 1 available) 11683 1726853267.30164: exiting _queue_task() for managed_node3/set_fact 11683 1726853267.30278: done queuing things up, now waiting for results queue to drain 11683 1726853267.30280: waiting for pending results... 
11683 1726853267.30489: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files 11683 1726853267.30590: in run() - task 02083763-bbaf-c5b2-e075-000000000442 11683 1726853267.30614: variable 'ansible_search_path' from source: unknown 11683 1726853267.30778: variable 'ansible_search_path' from source: unknown 11683 1726853267.30781: calling self._execute() 11683 1726853267.30789: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.30793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.30797: variable 'omit' from source: magic vars 11683 1726853267.31172: variable 'ansible_distribution_major_version' from source: facts 11683 1726853267.31190: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853267.31314: variable 'profile_stat' from source: set_fact 11683 1726853267.31333: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853267.31343: when evaluation is False, skipping this task 11683 1726853267.31353: _execute() done 11683 1726853267.31359: dumping result to json 11683 1726853267.31366: done dumping result, returning 11683 1726853267.31378: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag based on the profile files [02083763-bbaf-c5b2-e075-000000000442] 11683 1726853267.31387: sending task result for task 02083763-bbaf-c5b2-e075-000000000442 skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853267.31653: no more pending results, returning what we have 11683 1726853267.31657: results queue empty 11683 1726853267.31658: checking for any_errors_fatal 11683 1726853267.31667: done checking for any_errors_fatal 11683 1726853267.31668: checking for max_fail_percentage 11683 1726853267.31669: done checking for max_fail_percentage 11683 1726853267.31672: checking to see if all 
hosts have failed and the running result is not ok 11683 1726853267.31673: done checking to see if all hosts have failed 11683 1726853267.31674: getting the remaining hosts for this loop 11683 1726853267.31676: done getting the remaining hosts for this loop 11683 1726853267.31679: getting the next task for host managed_node3 11683 1726853267.31687: done getting next task for host managed_node3 11683 1726853267.31690: ^ task is: TASK: Get NM profile info 11683 1726853267.31694: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853267.31700: getting variables 11683 1726853267.31701: in VariableManager get_vars() 11683 1726853267.31743: Calling all_inventory to load vars for managed_node3 11683 1726853267.31747: Calling groups_inventory to load vars for managed_node3 11683 1726853267.31749: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853267.31764: Calling all_plugins_play to load vars for managed_node3 11683 1726853267.31766: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853267.31769: Calling groups_plugins_play to load vars for managed_node3 11683 1726853267.31946: done sending task result for task 02083763-bbaf-c5b2-e075-000000000442 11683 1726853267.31949: WORKER PROCESS EXITING 11683 1726853267.33198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853267.34645: done with get_vars() 11683 1726853267.34670: done getting variables 11683 1726853267.34729: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 13:27:47 -0400 (0:00:00.049) 0:00:20.419 ****** 11683 1726853267.34759: entering _queue_task() for managed_node3/shell 11683 1726853267.35103: worker is 1 (out of 1 available) 11683 1726853267.35115: exiting _queue_task() for managed_node3/shell 11683 1726853267.35127: done queuing things up, now waiting for results queue to drain 11683 1726853267.35128: waiting for pending results... 
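The records above show Ansible deciding to skip a task before any connection work happens: it evaluates each `when:` expression against the host's facts ("Evaluated conditional (ansible_distribution_major_version != '6'): True", then "Evaluated conditional (profile_stat.stat.exists): False"), and a False result short-circuits straight to "skipping this task". A minimal stand-in for that gating logic, using plain Python attribute access instead of Ansible's actual Jinja2 templar (the variable names mirror the log; the mechanism is simplified):

```python
from types import SimpleNamespace


def to_ns(value):
    """Recursively wrap dicts so dotted access (profile_stat.stat.exists)
    works the way Jinja2 attribute lookup does."""
    if isinstance(value, dict):
        return SimpleNamespace(**{k: to_ns(v) for k, v in value.items()})
    return value


def evaluate_when(expression: str, variables: dict) -> bool:
    """Evaluate a bare conditional against host vars/facts, the way a
    `when:` clause gates a task before it is queued. Stand-in only:
    Ansible renders these through its Jinja2 templar, not eval()."""
    scope = {k: to_ns(v) for k, v in variables.items()}
    return bool(eval(expression, {"__builtins__": {}}, scope))


# Mirrors the two conditionals evaluated in the log records above.
facts = {
    "ansible_distribution_major_version": "10",
    "profile_stat": {"stat": {"exists": False}},
}

run_task = (
    evaluate_when("ansible_distribution_major_version != '6'", facts)
    and evaluate_when("profile_stat.stat.exists", facts)
)
# run_task is False -> "when evaluation is False, skipping this task"
```

The skipped-task result in the log records exactly this outcome: `"false_condition": "profile_stat.stat.exists"` with `"skip_reason": "Conditional result was False"`.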
11683 1726853267.35647: running TaskExecutor() for managed_node3/TASK: Get NM profile info 11683 1726853267.36177: in run() - task 02083763-bbaf-c5b2-e075-000000000443 11683 1726853267.36181: variable 'ansible_search_path' from source: unknown 11683 1726853267.36184: variable 'ansible_search_path' from source: unknown 11683 1726853267.36187: calling self._execute() 11683 1726853267.36190: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.36193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.36195: variable 'omit' from source: magic vars 11683 1726853267.37008: variable 'ansible_distribution_major_version' from source: facts 11683 1726853267.37048: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853267.37059: variable 'omit' from source: magic vars 11683 1726853267.37190: variable 'omit' from source: magic vars 11683 1726853267.37405: variable 'profile' from source: include params 11683 1726853267.37415: variable 'item' from source: include params 11683 1726853267.37686: variable 'item' from source: include params 11683 1726853267.37689: variable 'omit' from source: magic vars 11683 1726853267.37692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853267.37790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853267.37905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853267.37908: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853267.37910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853267.37913: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 
1726853267.37915: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.37917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.38138: Set connection var ansible_shell_executable to /bin/sh 11683 1726853267.38154: Set connection var ansible_timeout to 10 11683 1726853267.38165: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853267.38176: Set connection var ansible_pipelining to False 11683 1726853267.38182: Set connection var ansible_shell_type to sh 11683 1726853267.38189: Set connection var ansible_connection to ssh 11683 1726853267.38216: variable 'ansible_shell_executable' from source: unknown 11683 1726853267.38228: variable 'ansible_connection' from source: unknown 11683 1726853267.38238: variable 'ansible_module_compression' from source: unknown 11683 1726853267.38245: variable 'ansible_shell_type' from source: unknown 11683 1726853267.38251: variable 'ansible_shell_executable' from source: unknown 11683 1726853267.38259: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.38267: variable 'ansible_pipelining' from source: unknown 11683 1726853267.38277: variable 'ansible_timeout' from source: unknown 11683 1726853267.38285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.38425: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853267.38440: variable 'omit' from source: magic vars 11683 1726853267.38560: starting attempt loop 11683 1726853267.38564: running the handler 11683 1726853267.38566: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853267.38569: _low_level_execute_command(): starting 11683 1726853267.38573: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853267.39445: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.39581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853267.39689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.39789: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.41515: stdout chunk (state=3): >>>/root <<< 11683 1726853267.41649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.41723: stdout chunk (state=3): >>><<< 11683 1726853267.41726: stderr chunk (state=3): >>><<< 11683 1726853267.41745: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853267.41763: _low_level_execute_command(): starting 11683 1726853267.41894: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027 `" && echo ansible-tmp-1726853267.4175186-12541-229245229177027="` echo /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027 `" ) && sleep 0' 11683 1726853267.43138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.43142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.43228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853267.43246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.43389: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.45460: stdout chunk (state=3): >>>ansible-tmp-1726853267.4175186-12541-229245229177027=/root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027 <<< 11683 1726853267.45489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.45533: stderr chunk (state=3): >>><<< 11683 1726853267.45539: stdout chunk (state=3): >>><<< 11683 1726853267.45569: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853267.4175186-12541-229245229177027=/root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853267.45738: variable 'ansible_module_compression' from source: unknown 11683 1726853267.45741: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853267.45783: variable 'ansible_facts' from source: unknown 11683 1726853267.46037: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py 11683 1726853267.46395: Sending initial data 11683 1726853267.46398: Sent initial data (156 bytes) 11683 1726853267.48126: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853267.48478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853267.48482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.48484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.50161: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853267.50230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853267.50308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpzjilf3nz /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py <<< 11683 1726853267.50317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py" <<< 11683 1726853267.50369: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpzjilf3nz" to remote "/root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py" <<< 11683 1726853267.52281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.52785: stderr chunk (state=3): >>><<< 11683 1726853267.52789: stdout chunk (state=3): >>><<< 11683 1726853267.52791: done transferring module to remote 11683 1726853267.52794: _low_level_execute_command(): starting 11683 1726853267.52796: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/ /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py && sleep 0' 11683 1726853267.54279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853267.54392: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.54531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.56391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.56432: stderr chunk (state=3): >>><<< 11683 1726853267.56588: stdout chunk (state=3): >>><<< 11683 1726853267.56607: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853267.56610: _low_level_execute_command(): starting 11683 1726853267.56616: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/AnsiballZ_command.py && sleep 0' 11683 1726853267.57678: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.57684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853267.57958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853267.57961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.57964: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.57984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853267.58187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.58360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11683 1726853267.76084: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:27:47.738210", "end": "2024-09-20 13:27:47.759538", "delta": "0:00:00.021328", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853267.77970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853267.77977: stdout chunk (state=3): >>><<< 11683 1726853267.77979: stderr chunk (state=3): >>><<< 11683 1726853267.77982: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 13:27:47.738210", "end": "2024-09-20 13:27:47.759538", "delta": "0:00:00.021328", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853267.77985: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853267.77988: _low_level_execute_command(): starting 11683 1726853267.77989: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853267.4175186-12541-229245229177027/ > /dev/null 2>&1 && sleep 0' 
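The `_low_level_execute_command()` records above trace the full module lifecycle for one task: probe the remote home directory (`echo ~`), create a per-task temp directory under `umask 77` with a timestamp-pid-random name, transfer the AnsiballZ payload over sftp, `chmod u+x` the directory and payload, run it with the target Python, then `rm -f -r` the directory. A local sketch of that same sequence, run through `/bin/sh` on the current machine instead of over an SSH multiplexed connection, with a trivial stand-in script in place of a real AnsiballZ zip payload:

```python
import random
import subprocess
import time


def sh(cmd: str) -> subprocess.CompletedProcess:
    """Run one command through /bin/sh, as _low_level_execute_command does."""
    return subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)


# 1. Probe the home directory ("echo ~ && sleep 0" in the log).
home = sh("echo ~ && sleep 0").stdout.strip()

# 2. Create the per-task temp dir under umask 77, using the same
#    timestamp-plus-random naming scheme seen in the log.
tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}-{random.randint(0, 2**48)}"
sh(f'( umask 77 && mkdir -p "`echo {home}/.ansible/tmp`" && mkdir "`echo {tmp}`" ) && sleep 0')

# 3. "Transfer" the module payload (sftp put in the real flow) and make
#    it executable along with its directory.
payload = f"{tmp}/AnsiballZ_command.py"
with open(payload, "w") as f:
    f.write('print("bond0.1  /etc/NetworkManager/system-connections/bond0.1.nmconnection")\n')
sh(f"chmod u+x {tmp}/ {payload} && sleep 0")

# 4. Execute the module with the target Python and capture its output.
result = sh(f"python3 {payload} && sleep 0")

# 5. Clean up the temp directory, exactly as the final rm -f -r does.
sh(f"rm -f -r {tmp}/ > /dev/null 2>&1 && sleep 0")
```

In the real flow the payload prints the module's JSON result on stdout (the `{"changed": true, "stdout": ...}` blob in the log), which the controller parses after the connection returns.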
11683 1726853267.78575: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853267.78592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853267.78608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853267.78629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853267.78695: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853267.78698: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853267.78821: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853267.78883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853267.80749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853267.80777: stderr chunk (state=3): >>><<< 11683 1726853267.80780: stdout chunk (state=3): >>><<< 11683 1726853267.80795: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853267.80801: handler run complete 11683 1726853267.80818: Evaluated conditional (False): False 11683 1726853267.80832: attempt loop complete, returning result 11683 1726853267.80835: _execute() done 11683 1726853267.80837: dumping result to json 11683 1726853267.80839: done dumping result, returning 11683 1726853267.80847: done running TaskExecutor() for managed_node3/TASK: Get NM profile info [02083763-bbaf-c5b2-e075-000000000443] 11683 1726853267.80850: sending task result for task 02083763-bbaf-c5b2-e075-000000000443 11683 1726853267.80941: done sending task result for task 02083763-bbaf-c5b2-e075-000000000443 11683 1726853267.80946: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.021328", "end": "2024-09-20 13:27:47.759538", "rc": 0, "start": "2024-09-20 13:27:47.738210" } STDOUT: bond0.1 
/etc/NetworkManager/system-connections/bond0.1.nmconnection 11683 1726853267.81013: no more pending results, returning what we have 11683 1726853267.81017: results queue empty 11683 1726853267.81018: checking for any_errors_fatal 11683 1726853267.81025: done checking for any_errors_fatal 11683 1726853267.81025: checking for max_fail_percentage 11683 1726853267.81027: done checking for max_fail_percentage 11683 1726853267.81028: checking to see if all hosts have failed and the running result is not ok 11683 1726853267.81029: done checking to see if all hosts have failed 11683 1726853267.81030: getting the remaining hosts for this loop 11683 1726853267.81032: done getting the remaining hosts for this loop 11683 1726853267.81035: getting the next task for host managed_node3 11683 1726853267.81042: done getting next task for host managed_node3 11683 1726853267.81046: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11683 1726853267.81050: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853267.81054: getting variables 11683 1726853267.81055: in VariableManager get_vars() 11683 1726853267.81098: Calling all_inventory to load vars for managed_node3 11683 1726853267.81100: Calling groups_inventory to load vars for managed_node3 11683 1726853267.81103: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853267.81112: Calling all_plugins_play to load vars for managed_node3 11683 1726853267.81115: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853267.81117: Calling groups_plugins_play to load vars for managed_node3 11683 1726853267.81996: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853267.83322: done with get_vars() 11683 1726853267.83351: done getting variables 11683 1726853267.83414: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 13:27:47 -0400 (0:00:00.486) 0:00:20.906 ****** 11683 1726853267.83451: entering _queue_task() for managed_node3/set_fact 11683 1726853267.84001: worker is 1 (out of 1 available) 11683 1726853267.84009: exiting _queue_task() for managed_node3/set_fact 11683 1726853267.84024: done queuing things up, now waiting for results queue to drain 11683 1726853267.84026: waiting for pending results... 
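The `set_fact` task being queued here derives its flags from the nmcli pipeline registered just above (`nmcli -f NAME,FILENAME connection show | grep bond0.1 | grep /etc`): the pipeline already filters for the profile name and for files under /etc, so rc 0 with a matching line means a NetworkManager profile file exists and lives in the ansible-managed location. A sketch of that flag logic in plain Python, fed the rc and stdout captured in the log (the flag names here are illustrative, not Ansible's API):

```python
def nm_profile_flags(cmd_rc: int, cmd_stdout: str, profile: str) -> dict:
    """Derive the exist/managed flags that the set_fact task computes
    from the registered nmcli result. The command pipeline has already
    grepped for the profile name and /etc, so a zero rc with a matching
    non-empty line is the signal."""
    exists = cmd_rc == 0 and any(
        line.split()[0] == profile and "/etc/" in line
        for line in cmd_stdout.splitlines()
        if line.strip()
    )
    return {
        "profile_exists": exists,          # illustrative name
        "profile_ansible_managed": exists,  # illustrative name
    }


# The values captured in the log record above.
flags = nm_profile_flags(
    0,
    "bond0.1  /etc/NetworkManager/system-connections/bond0.1.nmconnection",
    "bond0.1",
)
```

This matches the conditional the executor logs next: "Evaluated conditional (nm_profile_exists.rc == 0): True", which is what lets the set_fact handler run.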
11683 1726853267.84097: running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11683 1726853267.84201: in run() - task 02083763-bbaf-c5b2-e075-000000000444 11683 1726853267.84212: variable 'ansible_search_path' from source: unknown 11683 1726853267.84217: variable 'ansible_search_path' from source: unknown 11683 1726853267.84245: calling self._execute() 11683 1726853267.84328: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.84332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.84341: variable 'omit' from source: magic vars 11683 1726853267.84622: variable 'ansible_distribution_major_version' from source: facts 11683 1726853267.84631: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853267.84724: variable 'nm_profile_exists' from source: set_fact 11683 1726853267.84736: Evaluated conditional (nm_profile_exists.rc == 0): True 11683 1726853267.84741: variable 'omit' from source: magic vars 11683 1726853267.84781: variable 'omit' from source: magic vars 11683 1726853267.84807: variable 'omit' from source: magic vars 11683 1726853267.84839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853267.84867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853267.84889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853267.84901: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853267.84913: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853267.84939: variable 'inventory_hostname' from source: host vars for 'managed_node3' 
11683 1726853267.84943: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.84946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.85021: Set connection var ansible_shell_executable to /bin/sh 11683 1726853267.85025: Set connection var ansible_timeout to 10 11683 1726853267.85032: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853267.85036: Set connection var ansible_pipelining to False 11683 1726853267.85039: Set connection var ansible_shell_type to sh 11683 1726853267.85041: Set connection var ansible_connection to ssh 11683 1726853267.85061: variable 'ansible_shell_executable' from source: unknown 11683 1726853267.85064: variable 'ansible_connection' from source: unknown 11683 1726853267.85066: variable 'ansible_module_compression' from source: unknown 11683 1726853267.85069: variable 'ansible_shell_type' from source: unknown 11683 1726853267.85073: variable 'ansible_shell_executable' from source: unknown 11683 1726853267.85075: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.85078: variable 'ansible_pipelining' from source: unknown 11683 1726853267.85080: variable 'ansible_timeout' from source: unknown 11683 1726853267.85085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.85188: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853267.85196: variable 'omit' from source: magic vars 11683 1726853267.85202: starting attempt loop 11683 1726853267.85204: running the handler 11683 1726853267.85214: handler run complete 11683 1726853267.85223: attempt loop complete, returning result 11683 1726853267.85225: _execute() done 
11683 1726853267.85230: dumping result to json 11683 1726853267.85232: done dumping result, returning 11683 1726853267.85242: done running TaskExecutor() for managed_node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [02083763-bbaf-c5b2-e075-000000000444] 11683 1726853267.85244: sending task result for task 02083763-bbaf-c5b2-e075-000000000444 11683 1726853267.85317: done sending task result for task 02083763-bbaf-c5b2-e075-000000000444 11683 1726853267.85320: WORKER PROCESS EXITING ok: [managed_node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11683 1726853267.85398: no more pending results, returning what we have 11683 1726853267.85401: results queue empty 11683 1726853267.85402: checking for any_errors_fatal 11683 1726853267.85408: done checking for any_errors_fatal 11683 1726853267.85409: checking for max_fail_percentage 11683 1726853267.85411: done checking for max_fail_percentage 11683 1726853267.85412: checking to see if all hosts have failed and the running result is not ok 11683 1726853267.85413: done checking to see if all hosts have failed 11683 1726853267.85414: getting the remaining hosts for this loop 11683 1726853267.85415: done getting the remaining hosts for this loop 11683 1726853267.85419: getting the next task for host managed_node3 11683 1726853267.85427: done getting next task for host managed_node3 11683 1726853267.85430: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11683 1726853267.85433: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853267.85437: getting variables 11683 1726853267.85438: in VariableManager get_vars() 11683 1726853267.85481: Calling all_inventory to load vars for managed_node3 11683 1726853267.85483: Calling groups_inventory to load vars for managed_node3 11683 1726853267.85485: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853267.85494: Calling all_plugins_play to load vars for managed_node3 11683 1726853267.85496: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853267.85498: Calling groups_plugins_play to load vars for managed_node3 11683 1726853267.86421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853267.87630: done with get_vars() 11683 1726853267.87646: done getting variables 11683 1726853267.87688: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853267.87768: variable 'profile' from source: include params 11683 1726853267.87773: variable 'item' from source: include params 11683 1726853267.87815: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 13:27:47 -0400 (0:00:00.043) 0:00:20.950 ****** 11683 1726853267.87843: entering _queue_task() for managed_node3/command 11683 1726853267.88069: worker is 1 (out of 1 available) 11683 1726853267.88084: exiting _queue_task() for managed_node3/command 11683 1726853267.88096: done queuing things up, now waiting for results queue to drain 11683 1726853267.88097: waiting for pending results... 11683 1726853267.88275: running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11683 1726853267.88356: in run() - task 02083763-bbaf-c5b2-e075-000000000446 11683 1726853267.88367: variable 'ansible_search_path' from source: unknown 11683 1726853267.88372: variable 'ansible_search_path' from source: unknown 11683 1726853267.88399: calling self._execute() 11683 1726853267.88473: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.88477: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.88486: variable 'omit' from source: magic vars 11683 1726853267.88743: variable 'ansible_distribution_major_version' from source: facts 11683 1726853267.88756: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853267.88841: variable 'profile_stat' from source: set_fact 11683 1726853267.88855: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853267.88858: when evaluation is False, skipping this task 11683 1726853267.88862: _execute() done 11683 1726853267.88865: dumping result to json 11683 1726853267.88867: done dumping result, returning 11683 1726853267.88874: done running TaskExecutor() for managed_node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-c5b2-e075-000000000446] 11683 1726853267.88880: sending task result for task 02083763-bbaf-c5b2-e075-000000000446 11683 
1726853267.88958: done sending task result for task 02083763-bbaf-c5b2-e075-000000000446 11683 1726853267.88961: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853267.89021: no more pending results, returning what we have 11683 1726853267.89025: results queue empty 11683 1726853267.89026: checking for any_errors_fatal 11683 1726853267.89031: done checking for any_errors_fatal 11683 1726853267.89032: checking for max_fail_percentage 11683 1726853267.89034: done checking for max_fail_percentage 11683 1726853267.89035: checking to see if all hosts have failed and the running result is not ok 11683 1726853267.89036: done checking to see if all hosts have failed 11683 1726853267.89037: getting the remaining hosts for this loop 11683 1726853267.89039: done getting the remaining hosts for this loop 11683 1726853267.89042: getting the next task for host managed_node3 11683 1726853267.89048: done getting next task for host managed_node3 11683 1726853267.89050: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11683 1726853267.89054: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11683 1726853267.89058: getting variables 11683 1726853267.89059: in VariableManager get_vars() 11683 1726853267.89097: Calling all_inventory to load vars for managed_node3 11683 1726853267.89099: Calling groups_inventory to load vars for managed_node3 11683 1726853267.89101: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853267.89110: Calling all_plugins_play to load vars for managed_node3 11683 1726853267.89113: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853267.89115: Calling groups_plugins_play to load vars for managed_node3 11683 1726853267.89882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853267.90728: done with get_vars() 11683 1726853267.90743: done getting variables 11683 1726853267.90786: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853267.90860: variable 'profile' from source: include params 11683 1726853267.90863: variable 'item' from source: include params 11683 1726853267.90903: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 13:27:47 -0400 (0:00:00.030) 0:00:20.981 ****** 11683 1726853267.90926: entering _queue_task() for managed_node3/set_fact 11683 1726853267.91138: worker is 1 (out of 1 available) 11683 1726853267.91151: exiting _queue_task() for managed_node3/set_fact 11683 1726853267.91164: done queuing things up, now waiting for results queue 
to drain 11683 1726853267.91165: waiting for pending results... 11683 1726853267.91387: running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11683 1726853267.91519: in run() - task 02083763-bbaf-c5b2-e075-000000000447 11683 1726853267.91539: variable 'ansible_search_path' from source: unknown 11683 1726853267.91547: variable 'ansible_search_path' from source: unknown 11683 1726853267.91596: calling self._execute() 11683 1726853267.91705: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.91717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.91732: variable 'omit' from source: magic vars 11683 1726853267.92088: variable 'ansible_distribution_major_version' from source: facts 11683 1726853267.92105: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853267.92251: variable 'profile_stat' from source: set_fact 11683 1726853267.92265: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853267.92296: when evaluation is False, skipping this task 11683 1726853267.92299: _execute() done 11683 1726853267.92302: dumping result to json 11683 1726853267.92304: done dumping result, returning 11683 1726853267.92313: done running TaskExecutor() for managed_node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [02083763-bbaf-c5b2-e075-000000000447] 11683 1726853267.92318: sending task result for task 02083763-bbaf-c5b2-e075-000000000447 11683 1726853267.92417: done sending task result for task 02083763-bbaf-c5b2-e075-000000000447 11683 1726853267.92421: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853267.92474: no more pending results, returning what we have 11683 1726853267.92479: results queue empty 11683 1726853267.92480: checking for any_errors_fatal 11683 
1726853267.92486: done checking for any_errors_fatal 11683 1726853267.92486: checking for max_fail_percentage 11683 1726853267.92488: done checking for max_fail_percentage 11683 1726853267.92489: checking to see if all hosts have failed and the running result is not ok 11683 1726853267.92490: done checking to see if all hosts have failed 11683 1726853267.92491: getting the remaining hosts for this loop 11683 1726853267.92492: done getting the remaining hosts for this loop 11683 1726853267.92496: getting the next task for host managed_node3 11683 1726853267.92501: done getting next task for host managed_node3 11683 1726853267.92503: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11683 1726853267.92507: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853267.92511: getting variables 11683 1726853267.92512: in VariableManager get_vars() 11683 1726853267.92543: Calling all_inventory to load vars for managed_node3 11683 1726853267.92545: Calling groups_inventory to load vars for managed_node3 11683 1726853267.92547: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853267.92558: Calling all_plugins_play to load vars for managed_node3 11683 1726853267.92560: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853267.92567: Calling groups_plugins_play to load vars for managed_node3 11683 1726853267.97215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853267.98716: done with get_vars() 11683 1726853267.98748: done getting variables 11683 1726853267.98804: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853267.98903: variable 'profile' from source: include params 11683 1726853267.98907: variable 'item' from source: include params 11683 1726853267.98964: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 13:27:47 -0400 (0:00:00.080) 0:00:21.061 ****** 11683 1726853267.98994: entering _queue_task() for managed_node3/command 11683 1726853267.99335: worker is 1 (out of 1 available) 11683 1726853267.99348: exiting _queue_task() for managed_node3/command 11683 1726853267.99360: done queuing things up, now waiting for results queue to drain 11683 1726853267.99362: waiting for pending results... 
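The ifcfg-inspection tasks in this stretch of the log are all skipped because `profile_stat.stat.exists` evaluated to False: on this host the profile is stored as a NetworkManager keyfile (`/etc/NetworkManager/system-connections/bond0.1.nmconnection`, seen earlier in the log), not as an initscripts-style ifcfg file. A hypothetical sketch of the skipped command task queued above — the grep pattern and ifcfg path are assumptions for illustration, not taken from the real `get_profile_stat.yml`:

```yaml
# Hypothetical sketch only; the path and the grep pattern are assumed.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# System Role:" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists
```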
11683 1726853267.99658: running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 11683 1726853267.99823: in run() - task 02083763-bbaf-c5b2-e075-000000000448 11683 1726853267.99828: variable 'ansible_search_path' from source: unknown 11683 1726853267.99832: variable 'ansible_search_path' from source: unknown 11683 1726853267.99835: calling self._execute() 11683 1726853267.99916: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853267.99932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853267.99936: variable 'omit' from source: magic vars 11683 1726853268.00338: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.00341: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.00434: variable 'profile_stat' from source: set_fact 11683 1726853268.00448: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853268.00452: when evaluation is False, skipping this task 11683 1726853268.00455: _execute() done 11683 1726853268.00458: dumping result to json 11683 1726853268.00460: done dumping result, returning 11683 1726853268.00467: done running TaskExecutor() for managed_node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-c5b2-e075-000000000448] 11683 1726853268.00586: sending task result for task 02083763-bbaf-c5b2-e075-000000000448 11683 1726853268.00647: done sending task result for task 02083763-bbaf-c5b2-e075-000000000448 11683 1726853268.00651: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853268.00707: no more pending results, returning what we have 11683 1726853268.00711: results queue empty 11683 1726853268.00712: checking for any_errors_fatal 11683 1726853268.00721: done checking for any_errors_fatal 11683 1726853268.00722: checking for 
max_fail_percentage 11683 1726853268.00724: done checking for max_fail_percentage 11683 1726853268.00724: checking to see if all hosts have failed and the running result is not ok 11683 1726853268.00726: done checking to see if all hosts have failed 11683 1726853268.00726: getting the remaining hosts for this loop 11683 1726853268.00728: done getting the remaining hosts for this loop 11683 1726853268.00732: getting the next task for host managed_node3 11683 1726853268.00739: done getting next task for host managed_node3 11683 1726853268.00742: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11683 1726853268.00745: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853268.00751: getting variables 11683 1726853268.00756: in VariableManager get_vars() 11683 1726853268.00802: Calling all_inventory to load vars for managed_node3 11683 1726853268.00805: Calling groups_inventory to load vars for managed_node3 11683 1726853268.00807: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853268.00822: Calling all_plugins_play to load vars for managed_node3 11683 1726853268.00825: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853268.00827: Calling groups_plugins_play to load vars for managed_node3 11683 1726853268.02321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853268.03333: done with get_vars() 11683 1726853268.03349: done getting variables 11683 1726853268.03392: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853268.03473: variable 'profile' from source: include params 11683 1726853268.03476: variable 'item' from source: include params 11683 1726853268.03516: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 13:27:48 -0400 (0:00:00.045) 0:00:21.107 ****** 11683 1726853268.03541: entering _queue_task() for managed_node3/set_fact 11683 1726853268.03766: worker is 1 (out of 1 available) 11683 1726853268.03780: exiting _queue_task() for managed_node3/set_fact 11683 1726853268.03791: done queuing things up, now waiting for results queue to drain 11683 1726853268.03793: waiting for pending results... 
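The verify task queued above follows the same skip pattern, but the log shows it loading the `set_fact` action module rather than `command`. A hypothetical shape, with the fact name and condition guessed to be consistent with the log:

```yaml
# Hypothetical sketch; the real task at get_profile_stat.yml:69 may differ.
- name: Verify the fingerprint comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_fingerprint: true
  when: profile_stat.stat.exists
```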
11683 1726853268.03969: running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11683 1726853268.04179: in run() - task 02083763-bbaf-c5b2-e075-000000000449 11683 1726853268.04185: variable 'ansible_search_path' from source: unknown 11683 1726853268.04188: variable 'ansible_search_path' from source: unknown 11683 1726853268.04191: calling self._execute() 11683 1726853268.04275: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.04294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.04418: variable 'omit' from source: magic vars 11683 1726853268.04701: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.04731: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.04879: variable 'profile_stat' from source: set_fact 11683 1726853268.04896: Evaluated conditional (profile_stat.stat.exists): False 11683 1726853268.04903: when evaluation is False, skipping this task 11683 1726853268.04910: _execute() done 11683 1726853268.04917: dumping result to json 11683 1726853268.04925: done dumping result, returning 11683 1726853268.04934: done running TaskExecutor() for managed_node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [02083763-bbaf-c5b2-e075-000000000449] 11683 1726853268.04943: sending task result for task 02083763-bbaf-c5b2-e075-000000000449 11683 1726853268.05131: done sending task result for task 02083763-bbaf-c5b2-e075-000000000449 11683 1726853268.05134: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11683 1726853268.05186: no more pending results, returning what we have 11683 1726853268.05190: results queue empty 11683 1726853268.05191: checking for any_errors_fatal 11683 1726853268.05197: done checking for any_errors_fatal 11683 1726853268.05198: checking 
for max_fail_percentage 11683 1726853268.05199: done checking for max_fail_percentage 11683 1726853268.05200: checking to see if all hosts have failed and the running result is not ok 11683 1726853268.05201: done checking to see if all hosts have failed 11683 1726853268.05202: getting the remaining hosts for this loop 11683 1726853268.05203: done getting the remaining hosts for this loop 11683 1726853268.05206: getting the next task for host managed_node3 11683 1726853268.05214: done getting next task for host managed_node3 11683 1726853268.05217: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11683 1726853268.05221: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853268.05227: getting variables 11683 1726853268.05229: in VariableManager get_vars() 11683 1726853268.05273: Calling all_inventory to load vars for managed_node3 11683 1726853268.05276: Calling groups_inventory to load vars for managed_node3 11683 1726853268.05279: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853268.05293: Calling all_plugins_play to load vars for managed_node3 11683 1726853268.05296: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853268.05299: Calling groups_plugins_play to load vars for managed_node3 11683 1726853268.06582: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853268.07426: done with get_vars() 11683 1726853268.07442: done getting variables 11683 1726853268.07486: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853268.07563: variable 'profile' from source: include params 11683 1726853268.07566: variable 'item' from source: include params 11683 1726853268.07605: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 13:27:48 -0400 (0:00:00.040) 0:00:21.148 ****** 11683 1726853268.07626: entering _queue_task() for managed_node3/assert 11683 1726853268.07840: worker is 1 (out of 1 available) 11683 1726853268.07852: exiting _queue_task() for managed_node3/assert 11683 1726853268.07864: done queuing things up, now waiting for results queue to drain 11683 1726853268.07866: waiting for pending results... 
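The assert task queued above (task path `assert_profile_present.yml:5`) passes because `lsr_net_profile_exists` was set to true earlier in the run. A minimal sketch consistent with the log's "All assertions passed" result — the actual task may list additional assertions:

```yaml
# Minimal sketch consistent with the log; the real task may differ.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
```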
11683 1726853268.08290: running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' 11683 1726853268.08295: in run() - task 02083763-bbaf-c5b2-e075-00000000026e 11683 1726853268.08298: variable 'ansible_search_path' from source: unknown 11683 1726853268.08300: variable 'ansible_search_path' from source: unknown 11683 1726853268.08321: calling self._execute() 11683 1726853268.08424: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.08434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.08448: variable 'omit' from source: magic vars 11683 1726853268.08798: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.08816: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.08876: variable 'omit' from source: magic vars 11683 1726853268.08879: variable 'omit' from source: magic vars 11683 1726853268.08964: variable 'profile' from source: include params 11683 1726853268.08978: variable 'item' from source: include params 11683 1726853268.09075: variable 'item' from source: include params 11683 1726853268.09079: variable 'omit' from source: magic vars 11683 1726853268.09107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853268.09144: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853268.09168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853268.09191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.09209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.09243: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853268.09276: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.09279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.09578: Set connection var ansible_shell_executable to /bin/sh 11683 1726853268.09581: Set connection var ansible_timeout to 10 11683 1726853268.09583: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853268.09585: Set connection var ansible_pipelining to False 11683 1726853268.09587: Set connection var ansible_shell_type to sh 11683 1726853268.09590: Set connection var ansible_connection to ssh 11683 1726853268.09592: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.09594: variable 'ansible_connection' from source: unknown 11683 1726853268.09596: variable 'ansible_module_compression' from source: unknown 11683 1726853268.09598: variable 'ansible_shell_type' from source: unknown 11683 1726853268.09600: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.09602: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.09604: variable 'ansible_pipelining' from source: unknown 11683 1726853268.09606: variable 'ansible_timeout' from source: unknown 11683 1726853268.09608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.09729: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853268.09747: variable 'omit' from source: magic vars 11683 1726853268.09757: starting attempt loop 11683 1726853268.09763: running the handler 11683 1726853268.09875: variable 'lsr_net_profile_exists' from source: set_fact 11683 1726853268.09887: Evaluated conditional 
(lsr_net_profile_exists): True 11683 1726853268.09897: handler run complete 11683 1726853268.09916: attempt loop complete, returning result 11683 1726853268.09976: _execute() done 11683 1726853268.09980: dumping result to json 11683 1726853268.09983: done dumping result, returning 11683 1726853268.09985: done running TaskExecutor() for managed_node3/TASK: Assert that the profile is present - 'bond0.1' [02083763-bbaf-c5b2-e075-00000000026e] 11683 1726853268.09987: sending task result for task 02083763-bbaf-c5b2-e075-00000000026e 11683 1726853268.10276: done sending task result for task 02083763-bbaf-c5b2-e075-00000000026e 11683 1726853268.10280: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853268.10316: no more pending results, returning what we have 11683 1726853268.10319: results queue empty 11683 1726853268.10320: checking for any_errors_fatal 11683 1726853268.10325: done checking for any_errors_fatal 11683 1726853268.10326: checking for max_fail_percentage 11683 1726853268.10327: done checking for max_fail_percentage 11683 1726853268.10328: checking to see if all hosts have failed and the running result is not ok 11683 1726853268.10329: done checking to see if all hosts have failed 11683 1726853268.10329: getting the remaining hosts for this loop 11683 1726853268.10331: done getting the remaining hosts for this loop 11683 1726853268.10333: getting the next task for host managed_node3 11683 1726853268.10338: done getting next task for host managed_node3 11683 1726853268.10341: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11683 1726853268.10343: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853268.10347: getting variables 11683 1726853268.10348: in VariableManager get_vars() 11683 1726853268.10384: Calling all_inventory to load vars for managed_node3 11683 1726853268.10386: Calling groups_inventory to load vars for managed_node3 11683 1726853268.10389: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853268.10398: Calling all_plugins_play to load vars for managed_node3 11683 1726853268.10401: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853268.10404: Calling groups_plugins_play to load vars for managed_node3 11683 1726853268.11652: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853268.13339: done with get_vars() 11683 1726853268.13362: done getting variables 11683 1726853268.13420: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853268.13532: variable 'profile' from source: include params 11683 1726853268.13536: variable 'item' from source: include params 11683 1726853268.13594: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 13:27:48 -0400 
(0:00:00.059) 0:00:21.208 ****** 11683 1726853268.13634: entering _queue_task() for managed_node3/assert 11683 1726853268.14180: worker is 1 (out of 1 available) 11683 1726853268.14190: exiting _queue_task() for managed_node3/assert 11683 1726853268.14201: done queuing things up, now waiting for results queue to drain 11683 1726853268.14202: waiting for pending results... 11683 1726853268.14465: running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11683 1726853268.14590: in run() - task 02083763-bbaf-c5b2-e075-00000000026f 11683 1726853268.14612: variable 'ansible_search_path' from source: unknown 11683 1726853268.14621: variable 'ansible_search_path' from source: unknown 11683 1726853268.14673: calling self._execute() 11683 1726853268.14767: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.14782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.14796: variable 'omit' from source: magic vars 11683 1726853268.15170: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.15191: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.15205: variable 'omit' from source: magic vars 11683 1726853268.15244: variable 'omit' from source: magic vars 11683 1726853268.15349: variable 'profile' from source: include params 11683 1726853268.15360: variable 'item' from source: include params 11683 1726853268.15432: variable 'item' from source: include params 11683 1726853268.15480: variable 'omit' from source: magic vars 11683 1726853268.15527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853268.15585: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853268.15646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 
1726853268.15650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.15664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.15701: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853268.15709: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.15716: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.15852: Set connection var ansible_shell_executable to /bin/sh 11683 1726853268.15959: Set connection var ansible_timeout to 10 11683 1726853268.15962: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853268.15965: Set connection var ansible_pipelining to False 11683 1726853268.15966: Set connection var ansible_shell_type to sh 11683 1726853268.15969: Set connection var ansible_connection to ssh 11683 1726853268.15972: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.15974: variable 'ansible_connection' from source: unknown 11683 1726853268.15976: variable 'ansible_module_compression' from source: unknown 11683 1726853268.15978: variable 'ansible_shell_type' from source: unknown 11683 1726853268.15980: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.15982: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.15984: variable 'ansible_pipelining' from source: unknown 11683 1726853268.15986: variable 'ansible_timeout' from source: unknown 11683 1726853268.15988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.16110: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853268.16126: variable 'omit' from source: magic vars 11683 1726853268.16135: starting attempt loop 11683 1726853268.16175: running the handler 11683 1726853268.16261: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11683 1726853268.16272: Evaluated conditional (lsr_net_profile_ansible_managed): True 11683 1726853268.16283: handler run complete 11683 1726853268.16300: attempt loop complete, returning result 11683 1726853268.16306: _execute() done 11683 1726853268.16314: dumping result to json 11683 1726853268.16324: done dumping result, returning 11683 1726853268.16375: done running TaskExecutor() for managed_node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [02083763-bbaf-c5b2-e075-00000000026f] 11683 1726853268.16378: sending task result for task 02083763-bbaf-c5b2-e075-00000000026f 11683 1726853268.16699: done sending task result for task 02083763-bbaf-c5b2-e075-00000000026f 11683 1726853268.16702: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853268.16742: no more pending results, returning what we have 11683 1726853268.16744: results queue empty 11683 1726853268.16745: checking for any_errors_fatal 11683 1726853268.16751: done checking for any_errors_fatal 11683 1726853268.16751: checking for max_fail_percentage 11683 1726853268.16753: done checking for max_fail_percentage 11683 1726853268.16754: checking to see if all hosts have failed and the running result is not ok 11683 1726853268.16755: done checking to see if all hosts have failed 11683 1726853268.16756: getting the remaining hosts for this loop 11683 1726853268.16757: done getting the remaining hosts for this loop 11683 1726853268.16760: getting the next task for host managed_node3 11683 1726853268.16765: done getting 
next task for host managed_node3 11683 1726853268.16768: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11683 1726853268.16772: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853268.16776: getting variables 11683 1726853268.16777: in VariableManager get_vars() 11683 1726853268.16813: Calling all_inventory to load vars for managed_node3 11683 1726853268.16816: Calling groups_inventory to load vars for managed_node3 11683 1726853268.16819: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853268.16828: Calling all_plugins_play to load vars for managed_node3 11683 1726853268.16831: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853268.16834: Calling groups_plugins_play to load vars for managed_node3 11683 1726853268.18758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853268.20624: done with get_vars() 11683 1726853268.20653: done getting variables 11683 1726853268.20716: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853268.20830: variable 'profile' from source: include params 11683 1726853268.20834: variable 'item' from 
source: include params 11683 1726853268.20895: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 13:27:48 -0400 (0:00:00.073) 0:00:21.281 ****** 11683 1726853268.20932: entering _queue_task() for managed_node3/assert 11683 1726853268.21267: worker is 1 (out of 1 available) 11683 1726853268.21482: exiting _queue_task() for managed_node3/assert 11683 1726853268.21492: done queuing things up, now waiting for results queue to drain 11683 1726853268.21494: waiting for pending results... 11683 1726853268.21581: running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 11683 1726853268.21699: in run() - task 02083763-bbaf-c5b2-e075-000000000270 11683 1726853268.21724: variable 'ansible_search_path' from source: unknown 11683 1726853268.21731: variable 'ansible_search_path' from source: unknown 11683 1726853268.21774: calling self._execute() 11683 1726853268.21886: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.21897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.21911: variable 'omit' from source: magic vars 11683 1726853268.22282: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.22299: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.22309: variable 'omit' from source: magic vars 11683 1726853268.22352: variable 'omit' from source: magic vars 11683 1726853268.22456: variable 'profile' from source: include params 11683 1726853268.22464: variable 'item' from source: include params 11683 1726853268.22531: variable 'item' from source: include params 11683 1726853268.22555: variable 'omit' from source: magic vars 11683 1726853268.22609: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853268.22653: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853268.22683: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853268.22713: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.22731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.22768: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853268.22778: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.22785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.22889: Set connection var ansible_shell_executable to /bin/sh 11683 1726853268.22907: Set connection var ansible_timeout to 10 11683 1726853268.22924: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853268.22934: Set connection var ansible_pipelining to False 11683 1726853268.22940: Set connection var ansible_shell_type to sh 11683 1726853268.22946: Set connection var ansible_connection to ssh 11683 1726853268.22973: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.22980: variable 'ansible_connection' from source: unknown 11683 1726853268.23023: variable 'ansible_module_compression' from source: unknown 11683 1726853268.23026: variable 'ansible_shell_type' from source: unknown 11683 1726853268.23028: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.23030: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.23032: variable 'ansible_pipelining' from source: unknown 11683 1726853268.23035: variable 'ansible_timeout' 
from source: unknown 11683 1726853268.23037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.23158: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853268.23175: variable 'omit' from source: magic vars 11683 1726853268.23184: starting attempt loop 11683 1726853268.23239: running the handler 11683 1726853268.23299: variable 'lsr_net_profile_fingerprint' from source: set_fact 11683 1726853268.23309: Evaluated conditional (lsr_net_profile_fingerprint): True 11683 1726853268.23317: handler run complete 11683 1726853268.23334: attempt loop complete, returning result 11683 1726853268.23340: _execute() done 11683 1726853268.23352: dumping result to json 11683 1726853268.23359: done dumping result, returning 11683 1726853268.23369: done running TaskExecutor() for managed_node3/TASK: Assert that the fingerprint comment is present in bond0.1 [02083763-bbaf-c5b2-e075-000000000270] 11683 1726853268.23380: sending task result for task 02083763-bbaf-c5b2-e075-000000000270 ok: [managed_node3] => { "changed": false } MSG: All assertions passed 11683 1726853268.23518: no more pending results, returning what we have 11683 1726853268.23521: results queue empty 11683 1726853268.23522: checking for any_errors_fatal 11683 1726853268.23529: done checking for any_errors_fatal 11683 1726853268.23529: checking for max_fail_percentage 11683 1726853268.23531: done checking for max_fail_percentage 11683 1726853268.23532: checking to see if all hosts have failed and the running result is not ok 11683 1726853268.23533: done checking to see if all hosts have failed 11683 1726853268.23534: getting the remaining hosts for this loop 11683 1726853268.23536: done getting the remaining hosts for 
this loop 11683 1726853268.23538: getting the next task for host managed_node3 11683 1726853268.23548: done getting next task for host managed_node3 11683 1726853268.23551: ^ task is: TASK: ** TEST check polling interval 11683 1726853268.23553: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853268.23558: getting variables 11683 1726853268.23559: in VariableManager get_vars() 11683 1726853268.23604: Calling all_inventory to load vars for managed_node3 11683 1726853268.23606: Calling groups_inventory to load vars for managed_node3 11683 1726853268.23609: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853268.23620: Calling all_plugins_play to load vars for managed_node3 11683 1726853268.23623: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853268.23626: Calling groups_plugins_play to load vars for managed_node3 11683 1726853268.24484: done sending task result for task 02083763-bbaf-c5b2-e075-000000000270 11683 1726853268.24487: WORKER PROCESS EXITING 11683 1726853268.25163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853268.26820: done with get_vars() 11683 1726853268.26842: done getting variables 11683 1726853268.26907: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Friday 20 September 2024 13:27:48 -0400 (0:00:00.060) 0:00:21.341 ****** 11683 1726853268.26938: entering _queue_task() for managed_node3/command 11683 1726853268.27274: worker is 1 (out of 1 available) 11683 1726853268.27287: exiting _queue_task() for managed_node3/command 11683 1726853268.27300: done queuing things up, now waiting for results queue to drain 11683 1726853268.27302: waiting for pending results... 11683 1726853268.27582: running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval 11683 1726853268.27687: in run() - task 02083763-bbaf-c5b2-e075-000000000071 11683 1726853268.27713: variable 'ansible_search_path' from source: unknown 11683 1726853268.27755: calling self._execute() 11683 1726853268.27860: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.27874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.27892: variable 'omit' from source: magic vars 11683 1726853268.28266: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.28288: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.28301: variable 'omit' from source: magic vars 11683 1726853268.28326: variable 'omit' from source: magic vars 11683 1726853268.28458: variable 'controller_device' from source: play vars 11683 1726853268.28461: variable 'omit' from source: magic vars 11683 1726853268.28504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853268.28543: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853268.28574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853268.28674: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.28677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.28680: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853268.28682: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.28685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.28748: Set connection var ansible_shell_executable to /bin/sh 11683 1726853268.28764: Set connection var ansible_timeout to 10 11683 1726853268.28776: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853268.28785: Set connection var ansible_pipelining to False 11683 1726853268.28791: Set connection var ansible_shell_type to sh 11683 1726853268.28802: Set connection var ansible_connection to ssh 11683 1726853268.28826: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.28875: variable 'ansible_connection' from source: unknown 11683 1726853268.28878: variable 'ansible_module_compression' from source: unknown 11683 1726853268.28881: variable 'ansible_shell_type' from source: unknown 11683 1726853268.28882: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.28884: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.28886: variable 'ansible_pipelining' from source: unknown 11683 1726853268.28888: variable 'ansible_timeout' from source: unknown 11683 1726853268.28890: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.29006: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853268.29027: variable 'omit' from source: magic vars 11683 1726853268.29036: starting attempt loop 11683 1726853268.29042: running the handler 11683 1726853268.29062: _low_level_execute_command(): starting 11683 1726853268.29125: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853268.29907: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.29925: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.29942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.29967: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.30070: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.31800: stdout chunk (state=3): >>>/root <<< 11683 1726853268.31938: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.31953: stdout chunk (state=3): >>><<< 11683 1726853268.31977: stderr chunk (state=3): >>><<< 11683 1726853268.32096: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.32100: _low_level_execute_command(): starting 11683 1726853268.32103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776 `" && echo ansible-tmp-1726853268.3200328-12594-6296420573776="` echo /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776 `" ) && sleep 0' 11683 1726853268.32664: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853268.32683: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853268.32700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.32727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853268.32832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.32860: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.32962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.34952: stdout chunk (state=3): >>>ansible-tmp-1726853268.3200328-12594-6296420573776=/root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776 <<< 11683 1726853268.35128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.35131: stdout chunk (state=3): >>><<< 11683 1726853268.35134: stderr chunk (state=3): >>><<< 11683 1726853268.35154: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853268.3200328-12594-6296420573776=/root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.35198: variable 'ansible_module_compression' from source: unknown 11683 1726853268.35341: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853268.35345: variable 'ansible_facts' from source: unknown 11683 1726853268.35406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py 11683 1726853268.35596: Sending initial data 11683 1726853268.35599: Sent initial data (154 bytes) 11683 1726853268.36287: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.36345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.36364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.36390: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.36543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.38145: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11683 1726853268.38160: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11683 1726853268.38181: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853268.38276: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853268.38354: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpzhwb3xqa /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py <<< 11683 1726853268.38357: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py" <<< 11683 1726853268.38414: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpzhwb3xqa" to remote "/root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py" <<< 11683 1726853268.39349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.39352: stdout chunk (state=3): >>><<< 11683 1726853268.39354: stderr chunk (state=3): >>><<< 11683 1726853268.39362: done transferring module to remote 11683 1726853268.39380: _low_level_execute_command(): starting 11683 1726853268.39390: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/ /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py && sleep 0' 11683 1726853268.40017: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853268.40032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853268.40047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.40083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.40099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853268.40119: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853268.40186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.40198: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.40216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.40242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.40340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.42287: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.42291: stdout chunk (state=3): >>><<< 11683 1726853268.42294: stderr chunk (state=3): >>><<< 11683 1726853268.42313: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.42402: _low_level_execute_command(): starting 11683 1726853268.42406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/AnsiballZ_command.py && sleep 0' 11683 1726853268.42964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853268.42980: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853268.42993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.43017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853268.43085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.43130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.43149: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.43169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.43267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.59056: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 13:27:48.585726", "end": "2024-09-20 13:27:48.589271", "delta": "0:00:00.003545", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853268.60629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853268.60663: stderr chunk (state=3): >>><<< 11683 1726853268.60667: stdout chunk (state=3): >>><<< 11683 1726853268.60687: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 13:27:48.585726", "end": "2024-09-20 13:27:48.589271", "delta": "0:00:00.003545", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853268.60716: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853268.60723: _low_level_execute_command(): starting 11683 1726853268.60728: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853268.3200328-12594-6296420573776/ > /dev/null 2>&1 && sleep 0' 11683 1726853268.61155: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.61162: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853268.61190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.61193: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.61196: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.61283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.61287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.61291: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.61354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.63277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.63282: stdout chunk (state=3): >>><<< 11683 1726853268.63285: stderr chunk (state=3): >>><<< 11683 1726853268.63378: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.63381: handler run complete 11683 1726853268.63384: Evaluated conditional (False): False 11683 1726853268.63538: variable 'result' from source: unknown 11683 1726853268.63604: Evaluated conditional ('110' in result.stdout): True 11683 1726853268.63617: attempt loop complete, returning result 11683 1726853268.63628: _execute() done 11683 1726853268.63640: dumping result to json 11683 1726853268.63650: done dumping result, returning 11683 1726853268.63662: done running TaskExecutor() for managed_node3/TASK: ** TEST check polling interval [02083763-bbaf-c5b2-e075-000000000071] 11683 1726853268.63676: sending task result for task 02083763-bbaf-c5b2-e075-000000000071 11683 1726853268.63798: done sending task result for task 02083763-bbaf-c5b2-e075-000000000071 11683 1726853268.63802: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003545", "end": "2024-09-20 13:27:48.589271", "rc": 0, "start": "2024-09-20 13:27:48.585726" } STDOUT: MII Polling Interval (ms): 110 11683 1726853268.63878: no more pending results, returning what we have 11683 1726853268.63889: results queue empty 11683 1726853268.63889: checking for any_errors_fatal 11683 1726853268.63896: done checking for any_errors_fatal 11683 1726853268.63897: checking for max_fail_percentage 11683 1726853268.63899: done checking for max_fail_percentage 11683 1726853268.63899: checking to see if all hosts have failed and the running result is not ok 11683 1726853268.63901: done checking to see if all hosts have failed 11683 1726853268.63901: getting the remaining hosts for this loop 11683 1726853268.63903: done getting the remaining hosts for this loop 11683 1726853268.63906: getting the next task for host 
managed_node3 11683 1726853268.63912: done getting next task for host managed_node3 11683 1726853268.63915: ^ task is: TASK: ** TEST check IPv4 11683 1726853268.63917: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853268.63921: getting variables 11683 1726853268.63922: in VariableManager get_vars() 11683 1726853268.63960: Calling all_inventory to load vars for managed_node3 11683 1726853268.63962: Calling groups_inventory to load vars for managed_node3 11683 1726853268.63964: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853268.63976: Calling all_plugins_play to load vars for managed_node3 11683 1726853268.63979: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853268.63981: Calling groups_plugins_play to load vars for managed_node3 11683 1726853268.64776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853268.65786: done with get_vars() 11683 1726853268.65808: done getting variables 11683 1726853268.65869: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Friday 20 September 2024 13:27:48 -0400 (0:00:00.389) 0:00:21.730 ****** 11683 1726853268.65899: entering _queue_task() for managed_node3/command 11683 1726853268.66199: worker is 1 (out of 1 
available) 11683 1726853268.66211: exiting _queue_task() for managed_node3/command 11683 1726853268.66223: done queuing things up, now waiting for results queue to drain 11683 1726853268.66225: waiting for pending results... 11683 1726853268.66625: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 11683 1726853268.66630: in run() - task 02083763-bbaf-c5b2-e075-000000000072 11683 1726853268.66634: variable 'ansible_search_path' from source: unknown 11683 1726853268.66637: calling self._execute() 11683 1726853268.66720: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.66725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.66735: variable 'omit' from source: magic vars 11683 1726853268.67108: variable 'ansible_distribution_major_version' from source: facts 11683 1726853268.67119: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853268.67129: variable 'omit' from source: magic vars 11683 1726853268.67153: variable 'omit' from source: magic vars 11683 1726853268.67241: variable 'controller_device' from source: play vars 11683 1726853268.67262: variable 'omit' from source: magic vars 11683 1726853268.67302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853268.67335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853268.67358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853268.67376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.67388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853268.67417: variable 'inventory_hostname' from source: host vars for 
'managed_node3' 11683 1726853268.67421: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.67424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.67527: Set connection var ansible_shell_executable to /bin/sh 11683 1726853268.67538: Set connection var ansible_timeout to 10 11683 1726853268.67585: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853268.67588: Set connection var ansible_pipelining to False 11683 1726853268.67591: Set connection var ansible_shell_type to sh 11683 1726853268.67593: Set connection var ansible_connection to ssh 11683 1726853268.67595: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.67597: variable 'ansible_connection' from source: unknown 11683 1726853268.67600: variable 'ansible_module_compression' from source: unknown 11683 1726853268.67602: variable 'ansible_shell_type' from source: unknown 11683 1726853268.67604: variable 'ansible_shell_executable' from source: unknown 11683 1726853268.67606: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853268.67608: variable 'ansible_pipelining' from source: unknown 11683 1726853268.67610: variable 'ansible_timeout' from source: unknown 11683 1726853268.67613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853268.67735: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853268.67803: variable 'omit' from source: magic vars 11683 1726853268.67806: starting attempt loop 11683 1726853268.67809: running the handler 11683 1726853268.67811: _low_level_execute_command(): starting 11683 1726853268.67814: _low_level_execute_command(): executing: /bin/sh 
-c 'echo ~ && sleep 0' 11683 1726853268.68501: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853268.68512: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853268.68523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.68537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853268.68553: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853268.68560: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853268.68652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.68703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.68774: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.70492: stdout chunk (state=3): >>>/root <<< 11683 1726853268.70652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.70655: stdout chunk (state=3): >>><<< 11683 1726853268.70657: stderr chunk (state=3): >>><<< 11683 1726853268.70759: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.70764: _low_level_execute_command(): starting 11683 1726853268.70768: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947 `" && echo ansible-tmp-1726853268.706829-12618-111442622555947="` echo /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947 `" ) && sleep 0' 11683 1726853268.71344: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853268.71360: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853268.71441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.71492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.71509: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.71533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.71630: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.73595: stdout chunk (state=3): >>>ansible-tmp-1726853268.706829-12618-111442622555947=/root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947 <<< 11683 1726853268.73739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.73749: stdout chunk (state=3): >>><<< 11683 1726853268.73770: stderr chunk (state=3): >>><<< 11683 1726853268.73876: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853268.706829-12618-111442622555947=/root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.73880: variable 'ansible_module_compression' from source: unknown 11683 1726853268.73882: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853268.73928: variable 'ansible_facts' from source: unknown 11683 1726853268.74028: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py 11683 1726853268.74268: Sending initial data 11683 1726853268.74286: Sent initial data (155 bytes) 11683 1726853268.74848: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.74865: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.74878: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.74963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.76576: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11683 1726853268.76628: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11683 1726853268.76631: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11683 1726853268.76633: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11683 1726853268.76635: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853268.76703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853268.76766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpgyv_94g8 /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py <<< 11683 1726853268.76776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py" <<< 11683 1726853268.76848: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpgyv_94g8" to remote "/root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py" <<< 11683 1726853268.78176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.78186: stderr chunk (state=3): >>><<< 11683 1726853268.78189: stdout chunk (state=3): >>><<< 11683 1726853268.78227: done transferring module to remote 11683 1726853268.78239: _low_level_execute_command(): starting 11683 1726853268.78247: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/ /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py && sleep 0' 11683 1726853268.78811: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853268.78826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853268.78837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.78976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853268.78979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 
1726853268.78982: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853268.78985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.78988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.78990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.78997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.79078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.81332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853268.81336: stdout chunk (state=3): >>><<< 11683 1726853268.81378: stderr chunk (state=3): >>><<< 11683 1726853268.81381: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853268.81384: _low_level_execute_command(): starting 11683 1726853268.81386: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/AnsiballZ_command.py && sleep 0' 11683 1726853268.82693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853268.82736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853268.82765: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853268.82791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853268.82892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853268.98920: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.196/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:27:48.983983", "end": "2024-09-20 13:27:48.987879", "delta": "0:00:00.003896", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853269.00551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853269.00555: stdout chunk (state=3): >>><<< 11683 1726853269.00677: stderr chunk (state=3): >>><<< 11683 1726853269.00681: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.196/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 13:27:48.983983", "end": "2024-09-20 13:27:48.987879", "delta": "0:00:00.003896", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853269.00683: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853269.00686: _low_level_execute_command(): starting 11683 1726853269.00689: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853268.706829-12618-111442622555947/ > /dev/null 2>&1 && sleep 0' 11683 1726853269.01561: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853269.01584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853269.01609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.01618: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11683 1726853269.01689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.01705: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.01728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.01739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.01825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.04177: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.04181: stdout chunk (state=3): >>><<< 11683 1726853269.04183: stderr chunk (state=3): >>><<< 11683 1726853269.04186: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.04188: handler run complete 11683 1726853269.04190: Evaluated conditional (False): False 11683 1726853269.04192: variable 'result' from source: set_fact 11683 1726853269.04194: Evaluated conditional ('192.0.2' in result.stdout): True 11683 1726853269.04196: attempt loop complete, returning result 11683 1726853269.04198: _execute() done 11683 1726853269.04199: dumping result to json 11683 1726853269.04201: done dumping result, returning 11683 1726853269.04203: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv4 [02083763-bbaf-c5b2-e075-000000000072] 11683 1726853269.04205: sending task result for task 02083763-bbaf-c5b2-e075-000000000072 ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003896", "end": "2024-09-20 13:27:48.987879", "rc": 0, "start": "2024-09-20 13:27:48.983983" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.196/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 236sec preferred_lft 236sec 11683 1726853269.04377: no more pending results, returning what we have 11683 1726853269.04382: results queue empty 11683 1726853269.04383: checking for any_errors_fatal 11683 1726853269.04395: done checking for any_errors_fatal 11683 1726853269.04396: checking for max_fail_percentage 11683 1726853269.04398: done checking for max_fail_percentage 11683 1726853269.04399: checking to see if all hosts have failed and the running result is not ok 11683 1726853269.04400: done checking to see if all hosts have failed 11683 1726853269.04401: getting the remaining hosts for this loop 11683 1726853269.04402: done getting the remaining hosts for this loop 11683 
1726853269.04406: getting the next task for host managed_node3 11683 1726853269.04412: done getting next task for host managed_node3 11683 1726853269.04415: ^ task is: TASK: ** TEST check IPv6 11683 1726853269.04417: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853269.04421: getting variables 11683 1726853269.04423: in VariableManager get_vars() 11683 1726853269.04464: Calling all_inventory to load vars for managed_node3 11683 1726853269.04467: Calling groups_inventory to load vars for managed_node3 11683 1726853269.04469: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.04679: done sending task result for task 02083763-bbaf-c5b2-e075-000000000072 11683 1726853269.04683: WORKER PROCESS EXITING 11683 1726853269.04693: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.04696: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.04699: Calling groups_plugins_play to load vars for managed_node3 11683 1726853269.06211: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.07744: done with get_vars() 11683 1726853269.07768: done getting variables 11683 1726853269.07836: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 
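For context, the `** TEST check IPv4` result above (conditional `'192.0.2' in result.stdout` evaluated True, `attempts: 1`) and the `** TEST check IPv6` task being queued next correspond to check tasks of roughly this shape. This is a sketch reconstructed from the log, not the verbatim contents of `tests_bond.yml`; the task names and the command come from the log, while the `register` variable name, `retries`, and `delay` values are assumptions:

```yaml
# Sketch of the address-check tasks inferred from the log above.
# 'result', 'retries', and 'delay' are illustrative assumptions; the
# actual tasks live in tests_bond.yml and may differ.
- name: "** TEST check IPv4"
  command: ip -4 a s nm-bond          # matches the cmd shown in the task result
  register: result
  until: "'192.0.2' in result.stdout" # the conditional the log evaluates as True
  retries: 20
  delay: 3

- name: "** TEST check IPv6"
  command: ip -6 a s nm-bond
  register: result
  until: "'2001:db8' in result.stdout"
  retries: 20
  delay: 3
```

With `until`/`retries`, Ansible re-runs the command until the condition holds; here the IPv4 check passed on the first attempt, which is why the result shows `"attempts": 1`.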
Friday 20 September 2024 13:27:49 -0400 (0:00:00.419) 0:00:22.150 ****** 11683 1726853269.07865: entering _queue_task() for managed_node3/command 11683 1726853269.08390: worker is 1 (out of 1 available) 11683 1726853269.08400: exiting _queue_task() for managed_node3/command 11683 1726853269.08410: done queuing things up, now waiting for results queue to drain 11683 1726853269.08412: waiting for pending results... 11683 1726853269.08502: running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 11683 1726853269.08601: in run() - task 02083763-bbaf-c5b2-e075-000000000073 11683 1726853269.08621: variable 'ansible_search_path' from source: unknown 11683 1726853269.08663: calling self._execute() 11683 1726853269.08769: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.08783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.08797: variable 'omit' from source: magic vars 11683 1726853269.09162: variable 'ansible_distribution_major_version' from source: facts 11683 1726853269.09183: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853269.09194: variable 'omit' from source: magic vars 11683 1726853269.09216: variable 'omit' from source: magic vars 11683 1726853269.09311: variable 'controller_device' from source: play vars 11683 1726853269.09336: variable 'omit' from source: magic vars 11683 1726853269.09387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853269.09431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853269.09456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853269.09483: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853269.09499: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853269.09531: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853269.09576: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.09579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.09642: Set connection var ansible_shell_executable to /bin/sh 11683 1726853269.09656: Set connection var ansible_timeout to 10 11683 1726853269.09665: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853269.09674: Set connection var ansible_pipelining to False 11683 1726853269.09681: Set connection var ansible_shell_type to sh 11683 1726853269.09686: Set connection var ansible_connection to ssh 11683 1726853269.09711: variable 'ansible_shell_executable' from source: unknown 11683 1726853269.09723: variable 'ansible_connection' from source: unknown 11683 1726853269.09776: variable 'ansible_module_compression' from source: unknown 11683 1726853269.09779: variable 'ansible_shell_type' from source: unknown 11683 1726853269.09782: variable 'ansible_shell_executable' from source: unknown 11683 1726853269.09784: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.09786: variable 'ansible_pipelining' from source: unknown 11683 1726853269.09788: variable 'ansible_timeout' from source: unknown 11683 1726853269.09790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.09899: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853269.09915: variable 'omit' from source: magic vars 11683 1726853269.09923: starting 
attempt loop 11683 1726853269.09929: running the handler 11683 1726853269.09950: _low_level_execute_command(): starting 11683 1726853269.09960: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853269.10652: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853269.10689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.10784: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853269.10798: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.10810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.10825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.10915: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.12625: stdout chunk (state=3): >>>/root <<< 11683 1726853269.12787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.12792: stdout chunk (state=3): >>><<< 11683 1726853269.12795: stderr chunk (state=3): >>><<< 
11683 1726853269.12831: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.12870: _low_level_execute_command(): starting 11683 1726853269.12885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541 `" && echo ansible-tmp-1726853269.1285598-12655-235914236734541="` echo /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541 `" ) && sleep 0' 11683 1726853269.13634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853269.13649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.13665: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11683 1726853269.13686: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853269.13751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.13858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.13878: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.13903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.14066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.16053: stdout chunk (state=3): >>>ansible-tmp-1726853269.1285598-12655-235914236734541=/root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541 <<< 11683 1726853269.16225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.16229: stdout chunk (state=3): >>><<< 11683 1726853269.16232: stderr chunk (state=3): >>><<< 11683 1726853269.16252: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853269.1285598-12655-235914236734541=/root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.16288: variable 'ansible_module_compression' from source: unknown 11683 1726853269.16348: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853269.16482: variable 'ansible_facts' from source: unknown 11683 1726853269.16485: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py 11683 1726853269.16786: Sending initial data 11683 1726853269.16790: Sent initial data (156 bytes) 11683 1726853269.17290: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853269.17300: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.17311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853269.17329: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853269.17345: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853269.17356: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853269.17366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.17382: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853269.17458: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.17478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.17490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.17532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.17600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.19212: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server 
supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853269.19263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853269.19325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpaaenf0vo /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py <<< 11683 1726853269.19330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py" <<< 11683 1726853269.19383: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpaaenf0vo" to remote "/root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py" <<< 11683 1726853269.20003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.20045: stderr chunk (state=3): >>><<< 11683 1726853269.20050: stdout chunk (state=3): >>><<< 11683 1726853269.20069: done transferring module to remote 11683 1726853269.20085: _low_level_execute_command(): starting 11683 1726853269.20088: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/ /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py && sleep 0' 11683 1726853269.20515: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.20518: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853269.20521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.20523: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853269.20525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.20575: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.20580: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.20645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.22480: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.22507: stderr chunk (state=3): >>><<< 11683 1726853269.22510: stdout chunk (state=3): >>><<< 11683 1726853269.22520: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.22585: _low_level_execute_command(): starting 11683 1726853269.22588: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/AnsiballZ_command.py && sleep 0' 11683 1726853269.22962: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.22965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853269.22968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853269.22972: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.23026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.23034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.23037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.23098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.39036: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::86/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::a881:7bff:fecd:7a00/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::a881:7bff:fecd:7a00/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:27:49.385276", "end": "2024-09-20 13:27:49.389107", "delta": "0:00:00.003831", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853269.40667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853269.40693: stderr chunk (state=3): >>><<< 11683 1726853269.40697: stdout chunk (state=3): >>><<< 11683 1726853269.40713: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::86/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::a881:7bff:fecd:7a00/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::a881:7bff:fecd:7a00/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 13:27:49.385276", "end": "2024-09-20 13:27:49.389107", "delta": "0:00:00.003831", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853269.40745: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853269.40756: _low_level_execute_command(): starting 11683 1726853269.40760: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853269.1285598-12655-235914236734541/ > /dev/null 2>&1 && sleep 0' 11683 1726853269.41221: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.41225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.41227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address <<< 11683 1726853269.41229: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853269.41231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.41277: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.41295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.41302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.41356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.43262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.43287: stderr chunk (state=3): >>><<< 11683 1726853269.43290: stdout chunk (state=3): >>><<< 11683 1726853269.43303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.43310: handler run complete 11683 1726853269.43332: Evaluated conditional (False): False 11683 1726853269.43445: variable 'result' from source: set_fact 11683 1726853269.43460: Evaluated conditional ('2001' in result.stdout): True 11683 1726853269.43475: attempt loop complete, returning result 11683 1726853269.43479: _execute() done 11683 1726853269.43482: dumping result to json 11683 1726853269.43487: done dumping result, returning 11683 1726853269.43495: done running TaskExecutor() for managed_node3/TASK: ** TEST check IPv6 [02083763-bbaf-c5b2-e075-000000000073] 11683 1726853269.43499: sending task result for task 02083763-bbaf-c5b2-e075-000000000073 11683 1726853269.43594: done sending task result for task 02083763-bbaf-c5b2-e075-000000000073 11683 1726853269.43597: WORKER PROCESS EXITING ok: [managed_node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003831", "end": "2024-09-20 13:27:49.389107", "rc": 0, "start": "2024-09-20 13:27:49.385276" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::86/128 scope global dynamic noprefixroute valid_lft 236sec preferred_lft 236sec inet6 2001:db8::a881:7bff:fecd:7a00/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::a881:7bff:fecd:7a00/64 scope link noprefixroute valid_lft forever preferred_lft forever 11683 1726853269.43669: no more pending results, returning what we have 11683 1726853269.43675: results queue empty 11683 1726853269.43676: 
checking for any_errors_fatal 11683 1726853269.43683: done checking for any_errors_fatal 11683 1726853269.43684: checking for max_fail_percentage 11683 1726853269.43686: done checking for max_fail_percentage 11683 1726853269.43686: checking to see if all hosts have failed and the running result is not ok 11683 1726853269.43687: done checking to see if all hosts have failed 11683 1726853269.43688: getting the remaining hosts for this loop 11683 1726853269.43690: done getting the remaining hosts for this loop 11683 1726853269.43693: getting the next task for host managed_node3 11683 1726853269.43703: done getting next task for host managed_node3 11683 1726853269.43712: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11683 1726853269.43715: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853269.43734: getting variables 11683 1726853269.43735: in VariableManager get_vars() 11683 1726853269.43775: Calling all_inventory to load vars for managed_node3 11683 1726853269.43777: Calling groups_inventory to load vars for managed_node3 11683 1726853269.43779: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.43789: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.43791: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.43793: Calling groups_plugins_play to load vars for managed_node3 11683 1726853269.44561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.45427: done with get_vars() 11683 1726853269.45449: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 13:27:49 -0400 (0:00:00.376) 0:00:22.527 ****** 11683 1726853269.45523: entering _queue_task() for managed_node3/include_tasks 11683 1726853269.45815: worker is 1 (out of 1 available) 11683 1726853269.45830: exiting _queue_task() for managed_node3/include_tasks 11683 1726853269.45839: done queuing things up, now waiting for results queue to drain 11683 1726853269.45841: waiting for pending results... 
11683 1726853269.46039: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11683 1726853269.46144: in run() - task 02083763-bbaf-c5b2-e075-00000000007c 11683 1726853269.46158: variable 'ansible_search_path' from source: unknown 11683 1726853269.46162: variable 'ansible_search_path' from source: unknown 11683 1726853269.46194: calling self._execute() 11683 1726853269.46272: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.46276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.46286: variable 'omit' from source: magic vars 11683 1726853269.46556: variable 'ansible_distribution_major_version' from source: facts 11683 1726853269.46567: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853269.46574: _execute() done 11683 1726853269.46577: dumping result to json 11683 1726853269.46581: done dumping result, returning 11683 1726853269.46588: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [02083763-bbaf-c5b2-e075-00000000007c] 11683 1726853269.46593: sending task result for task 02083763-bbaf-c5b2-e075-00000000007c 11683 1726853269.46683: done sending task result for task 02083763-bbaf-c5b2-e075-00000000007c 11683 1726853269.46685: WORKER PROCESS EXITING 11683 1726853269.46758: no more pending results, returning what we have 11683 1726853269.46763: in VariableManager get_vars() 11683 1726853269.46816: Calling all_inventory to load vars for managed_node3 11683 1726853269.46819: Calling groups_inventory to load vars for managed_node3 11683 1726853269.46821: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.46831: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.46833: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.46835: Calling 
groups_plugins_play to load vars for managed_node3 11683 1726853269.48200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.49118: done with get_vars() 11683 1726853269.49134: variable 'ansible_search_path' from source: unknown 11683 1726853269.49135: variable 'ansible_search_path' from source: unknown 11683 1726853269.49166: we have included files to process 11683 1726853269.49167: generating all_blocks data 11683 1726853269.49169: done generating all_blocks data 11683 1726853269.49174: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11683 1726853269.49175: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11683 1726853269.49176: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11683 1726853269.49557: done processing included file 11683 1726853269.49559: iterating over new_blocks loaded from include file 11683 1726853269.49560: in VariableManager get_vars() 11683 1726853269.49580: done with get_vars() 11683 1726853269.49581: filtering new block on tags 11683 1726853269.49600: done filtering new block on tags 11683 1726853269.49602: in VariableManager get_vars() 11683 1726853269.49616: done with get_vars() 11683 1726853269.49617: filtering new block on tags 11683 1726853269.49640: done filtering new block on tags 11683 1726853269.49642: in VariableManager get_vars() 11683 1726853269.49659: done with get_vars() 11683 1726853269.49661: filtering new block on tags 11683 1726853269.49691: done filtering new block on tags 11683 1726853269.49693: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node3 11683 1726853269.49697: extending task lists for 
all hosts with included blocks 11683 1726853269.50335: done extending task lists 11683 1726853269.50337: done processing included files 11683 1726853269.50337: results queue empty 11683 1726853269.50338: checking for any_errors_fatal 11683 1726853269.50342: done checking for any_errors_fatal 11683 1726853269.50342: checking for max_fail_percentage 11683 1726853269.50343: done checking for max_fail_percentage 11683 1726853269.50344: checking to see if all hosts have failed and the running result is not ok 11683 1726853269.50345: done checking to see if all hosts have failed 11683 1726853269.50346: getting the remaining hosts for this loop 11683 1726853269.50347: done getting the remaining hosts for this loop 11683 1726853269.50349: getting the next task for host managed_node3 11683 1726853269.50353: done getting next task for host managed_node3 11683 1726853269.50356: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11683 1726853269.50359: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853269.50368: getting variables 11683 1726853269.50369: in VariableManager get_vars() 11683 1726853269.50387: Calling all_inventory to load vars for managed_node3 11683 1726853269.50389: Calling groups_inventory to load vars for managed_node3 11683 1726853269.50391: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.50397: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.50399: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.50402: Calling groups_plugins_play to load vars for managed_node3 11683 1726853269.52756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.54711: done with get_vars() 11683 1726853269.54736: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 13:27:49 -0400 (0:00:00.093) 0:00:22.620 ****** 11683 1726853269.54831: entering _queue_task() for managed_node3/setup 11683 1726853269.55285: worker is 1 (out of 1 available) 11683 1726853269.55299: exiting _queue_task() for managed_node3/setup 11683 1726853269.55310: done queuing things up, now waiting for results queue to drain 11683 1726853269.55311: waiting for pending results... 
11683 1726853269.55583: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11683 1726853269.55777: in run() - task 02083763-bbaf-c5b2-e075-000000000491 11683 1726853269.55801: variable 'ansible_search_path' from source: unknown 11683 1726853269.55807: variable 'ansible_search_path' from source: unknown 11683 1726853269.55842: calling self._execute() 11683 1726853269.56279: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.56284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.56287: variable 'omit' from source: magic vars 11683 1726853269.56731: variable 'ansible_distribution_major_version' from source: facts 11683 1726853269.56752: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853269.56981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853269.59473: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853269.59570: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853269.59619: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853269.59669: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853269.59704: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853269.59794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853269.59833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853269.59876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853269.59959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853269.59963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853269.60012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853269.60037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853269.60072: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853269.60115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853269.60134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853269.60376: variable '__network_required_facts' from source: role 
'' defaults 11683 1726853269.60380: variable 'ansible_facts' from source: unknown 11683 1726853269.61512: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11683 1726853269.61593: when evaluation is False, skipping this task 11683 1726853269.61606: _execute() done 11683 1726853269.61615: dumping result to json 11683 1726853269.61623: done dumping result, returning 11683 1726853269.61636: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [02083763-bbaf-c5b2-e075-000000000491] 11683 1726853269.61648: sending task result for task 02083763-bbaf-c5b2-e075-000000000491 skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853269.62024: no more pending results, returning what we have 11683 1726853269.62029: results queue empty 11683 1726853269.62030: checking for any_errors_fatal 11683 1726853269.62032: done checking for any_errors_fatal 11683 1726853269.62032: checking for max_fail_percentage 11683 1726853269.62034: done checking for max_fail_percentage 11683 1726853269.62035: checking to see if all hosts have failed and the running result is not ok 11683 1726853269.62036: done checking to see if all hosts have failed 11683 1726853269.62037: getting the remaining hosts for this loop 11683 1726853269.62039: done getting the remaining hosts for this loop 11683 1726853269.62045: getting the next task for host managed_node3 11683 1726853269.62057: done getting next task for host managed_node3 11683 1726853269.62061: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11683 1726853269.62067: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853269.62089: getting variables 11683 1726853269.62092: in VariableManager get_vars() 11683 1726853269.62135: Calling all_inventory to load vars for managed_node3 11683 1726853269.62138: Calling groups_inventory to load vars for managed_node3 11683 1726853269.62141: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.62155: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.62158: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.62161: Calling groups_plugins_play to load vars for managed_node3 11683 1726853269.62780: done sending task result for task 02083763-bbaf-c5b2-e075-000000000491 11683 1726853269.62784: WORKER PROCESS EXITING 11683 1726853269.65655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.68104: done with get_vars() 11683 1726853269.68134: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 13:27:49 -0400 (0:00:00.134) 0:00:22.754 ****** 11683 1726853269.68253: entering _queue_task() for managed_node3/stat 11683 1726853269.68604: worker is 1 (out of 1 available) 11683 1726853269.68618: exiting _queue_task() for managed_node3/stat 11683 1726853269.68630: done queuing things up, now waiting for results queue to drain 11683 1726853269.68631: waiting for pending results... 11683 1726853269.69057: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 11683 1726853269.69235: in run() - task 02083763-bbaf-c5b2-e075-000000000493 11683 1726853269.69263: variable 'ansible_search_path' from source: unknown 11683 1726853269.69275: variable 'ansible_search_path' from source: unknown 11683 1726853269.69318: calling self._execute() 11683 1726853269.69430: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.69441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.69464: variable 'omit' from source: magic vars 11683 1726853269.69845: variable 'ansible_distribution_major_version' from source: facts 11683 1726853269.69862: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853269.70038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853269.70334: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853269.70402: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853269.70505: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853269.70658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
11683 1726853269.70662: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853269.70720: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853269.70756: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853269.70826: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853269.70930: variable '__network_is_ostree' from source: set_fact 11683 1726853269.70946: Evaluated conditional (not __network_is_ostree is defined): False 11683 1726853269.70955: when evaluation is False, skipping this task 11683 1726853269.70964: _execute() done 11683 1726853269.70986: dumping result to json 11683 1726853269.70999: done dumping result, returning 11683 1726853269.71012: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [02083763-bbaf-c5b2-e075-000000000493] 11683 1726853269.71023: sending task result for task 02083763-bbaf-c5b2-e075-000000000493 11683 1726853269.71277: done sending task result for task 02083763-bbaf-c5b2-e075-000000000493 11683 1726853269.71281: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11683 1726853269.71340: no more pending results, returning what we have 11683 1726853269.71347: results queue empty 11683 1726853269.71349: checking for any_errors_fatal 11683 1726853269.71355: 
done checking for any_errors_fatal 11683 1726853269.71356: checking for max_fail_percentage 11683 1726853269.71358: done checking for max_fail_percentage 11683 1726853269.71359: checking to see if all hosts have failed and the running result is not ok 11683 1726853269.71360: done checking to see if all hosts have failed 11683 1726853269.71361: getting the remaining hosts for this loop 11683 1726853269.71363: done getting the remaining hosts for this loop 11683 1726853269.71366: getting the next task for host managed_node3 11683 1726853269.71375: done getting next task for host managed_node3 11683 1726853269.71380: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11683 1726853269.71385: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853269.71405: getting variables 11683 1726853269.71407: in VariableManager get_vars() 11683 1726853269.71454: Calling all_inventory to load vars for managed_node3 11683 1726853269.71458: Calling groups_inventory to load vars for managed_node3 11683 1726853269.71460: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.71575: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.71580: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.71584: Calling groups_plugins_play to load vars for managed_node3 11683 1726853269.73283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.74909: done with get_vars() 11683 1726853269.74938: done getting variables 11683 1726853269.75001: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 13:27:49 -0400 (0:00:00.067) 0:00:22.822 ****** 11683 1726853269.75040: entering _queue_task() for managed_node3/set_fact 11683 1726853269.75424: worker is 1 (out of 1 available) 11683 1726853269.75435: exiting _queue_task() for managed_node3/set_fact 11683 1726853269.75449: done queuing things up, now waiting for results queue to drain 11683 1726853269.75451: waiting for pending results... 
11683 1726853269.75781: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11683 1726853269.75935: in run() - task 02083763-bbaf-c5b2-e075-000000000494 11683 1726853269.75952: variable 'ansible_search_path' from source: unknown 11683 1726853269.75955: variable 'ansible_search_path' from source: unknown 11683 1726853269.75991: calling self._execute() 11683 1726853269.76093: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.76097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.76112: variable 'omit' from source: magic vars 11683 1726853269.76676: variable 'ansible_distribution_major_version' from source: facts 11683 1726853269.76680: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853269.76683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853269.76933: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853269.76979: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853269.77016: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853269.77049: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853269.77135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853269.77160: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853269.77187: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853269.77217: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853269.77378: variable '__network_is_ostree' from source: set_fact 11683 1726853269.77385: Evaluated conditional (not __network_is_ostree is defined): False 11683 1726853269.77394: when evaluation is False, skipping this task 11683 1726853269.77397: _execute() done 11683 1726853269.77426: dumping result to json 11683 1726853269.77431: done dumping result, returning 11683 1726853269.77441: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [02083763-bbaf-c5b2-e075-000000000494] 11683 1726853269.77447: sending task result for task 02083763-bbaf-c5b2-e075-000000000494 11683 1726853269.77529: done sending task result for task 02083763-bbaf-c5b2-e075-000000000494 11683 1726853269.77532: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11683 1726853269.77609: no more pending results, returning what we have 11683 1726853269.77613: results queue empty 11683 1726853269.77614: checking for any_errors_fatal 11683 1726853269.77620: done checking for any_errors_fatal 11683 1726853269.77621: checking for max_fail_percentage 11683 1726853269.77623: done checking for max_fail_percentage 11683 1726853269.77624: checking to see if all hosts have failed and the running result is not ok 11683 1726853269.77625: done checking to see if all hosts have failed 11683 1726853269.77626: getting the remaining hosts for this loop 11683 1726853269.77627: done getting the remaining hosts for this loop 
11683 1726853269.77631: getting the next task for host managed_node3 11683 1726853269.77640: done getting next task for host managed_node3 11683 1726853269.77647: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11683 1726853269.77775: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853269.77792: getting variables 11683 1726853269.77794: in VariableManager get_vars() 11683 1726853269.77829: Calling all_inventory to load vars for managed_node3 11683 1726853269.77832: Calling groups_inventory to load vars for managed_node3 11683 1726853269.77834: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853269.77845: Calling all_plugins_play to load vars for managed_node3 11683 1726853269.77849: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853269.77852: Calling groups_plugins_play to load vars for managed_node3 11683 1726853269.80247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853269.82208: done with get_vars() 11683 1726853269.82238: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 13:27:49 -0400 (0:00:00.073) 0:00:22.895 ****** 11683 1726853269.82374: entering _queue_task() for managed_node3/service_facts 11683 1726853269.82766: worker is 1 (out of 1 available) 11683 1726853269.82955: exiting _queue_task() for managed_node3/service_facts 11683 1726853269.82965: done queuing things up, now waiting for results queue to drain 11683 1726853269.82966: waiting for pending results... 
11683 1726853269.83590: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running 11683 1726853269.83678: in run() - task 02083763-bbaf-c5b2-e075-000000000496 11683 1726853269.83828: variable 'ansible_search_path' from source: unknown 11683 1726853269.83831: variable 'ansible_search_path' from source: unknown 11683 1726853269.83866: calling self._execute() 11683 1726853269.84093: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.84100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.84217: variable 'omit' from source: magic vars 11683 1726853269.85216: variable 'ansible_distribution_major_version' from source: facts 11683 1726853269.85227: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853269.85235: variable 'omit' from source: magic vars 11683 1726853269.85358: variable 'omit' from source: magic vars 11683 1726853269.85428: variable 'omit' from source: magic vars 11683 1726853269.85779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853269.85783: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853269.85785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853269.85788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853269.85790: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853269.85793: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853269.85795: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.85797: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11683 1726853269.85800: Set connection var ansible_shell_executable to /bin/sh 11683 1726853269.85803: Set connection var ansible_timeout to 10 11683 1726853269.85806: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853269.85808: Set connection var ansible_pipelining to False 11683 1726853269.85811: Set connection var ansible_shell_type to sh 11683 1726853269.85814: Set connection var ansible_connection to ssh 11683 1726853269.85825: variable 'ansible_shell_executable' from source: unknown 11683 1726853269.85828: variable 'ansible_connection' from source: unknown 11683 1726853269.85831: variable 'ansible_module_compression' from source: unknown 11683 1726853269.85833: variable 'ansible_shell_type' from source: unknown 11683 1726853269.85877: variable 'ansible_shell_executable' from source: unknown 11683 1726853269.85880: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853269.85884: variable 'ansible_pipelining' from source: unknown 11683 1726853269.85886: variable 'ansible_timeout' from source: unknown 11683 1726853269.85888: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853269.86096: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853269.86111: variable 'omit' from source: magic vars 11683 1726853269.86115: starting attempt loop 11683 1726853269.86117: running the handler 11683 1726853269.86130: _low_level_execute_command(): starting 11683 1726853269.86139: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853269.87218: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853269.87224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.87283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.87287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.87306: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.87401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.89127: stdout chunk (state=3): >>>/root <<< 11683 1726853269.89285: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.89289: stdout chunk (state=3): >>><<< 11683 1726853269.89292: stderr chunk (state=3): >>><<< 11683 1726853269.89310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.89329: _low_level_execute_command(): starting 11683 1726853269.89340: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142 `" && echo ansible-tmp-1726853269.8931704-12697-29098413444142="` echo /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142 `" ) && sleep 0' 11683 1726853269.90173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853269.90199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.90217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853269.90236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853269.90258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853269.90269: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853269.90309: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.90416: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853269.90432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853269.90523: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.90548: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.90649: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.92648: stdout chunk (state=3): >>>ansible-tmp-1726853269.8931704-12697-29098413444142=/root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142 <<< 11683 1726853269.92889: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.92893: stdout chunk (state=3): >>><<< 11683 1726853269.92920: stderr chunk (state=3): >>><<< 11683 1726853269.92924: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853269.8931704-12697-29098413444142=/root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853269.92966: variable 'ansible_module_compression' from source: unknown 11683 1726853269.93030: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11683 1726853269.93052: variable 'ansible_facts' from source: unknown 11683 1726853269.93346: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py 11683 1726853269.93717: Sending initial data 11683 1726853269.93720: Sent initial data (161 bytes) 11683 1726853269.94995: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853269.95004: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853269.95018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853269.95039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853269.95134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853269.96787: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853269.96838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853269.96996: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp32jdviu6 /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py <<< 11683 1726853269.97003: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py" <<< 11683 1726853269.97052: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp32jdviu6" to remote "/root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py" <<< 11683 1726853269.98492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853269.98563: stderr chunk (state=3): >>><<< 11683 1726853269.98567: stdout chunk (state=3): >>><<< 11683 1726853269.98590: done transferring module to remote 11683 1726853269.98608: _low_level_execute_command(): starting 11683 1726853269.98611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/ /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py && sleep 0' 11683 1726853269.99838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853269.99845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853270.00092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853270.00163: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853270.02098: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853270.02102: stderr chunk (state=3): >>><<< 11683 1726853270.02104: stdout chunk (state=3): >>><<< 11683 1726853270.02107: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853270.02109: _low_level_execute_command(): starting 11683 1726853270.02111: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/AnsiballZ_service_facts.py && sleep 0' 11683 1726853270.03457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853270.03461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853270.03464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853270.03466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853270.03492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853270.03496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853270.03587: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11683 1726853270.03785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853271.62650: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11683 1726853271.62657: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11683 1726853271.62725: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": 
"systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": 
"dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integratio<<< 11683 1726853271.62792: stdout chunk (state=3): >>>n.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": 
"systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11683 1726853271.64289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853271.64372: stderr chunk (state=3): >>><<< 11683 1726853271.64399: stdout chunk (state=3): >>><<< 11683 1726853271.64590: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": 
{"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": 
"capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": 
"dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": 
"sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853271.66299: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853271.66307: _low_level_execute_command(): starting 11683 1726853271.66312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853269.8931704-12697-29098413444142/ > /dev/null 2>&1 && sleep 0' 11683 1726853271.67228: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853271.67235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853271.67313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853271.67320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.67336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853271.67342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853271.67356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.67700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853271.67705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853271.67802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853271.69690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853271.69715: stderr chunk (state=3): >>><<< 11683 1726853271.69718: stdout chunk (state=3): >>><<< 11683 1726853271.69733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853271.69739: handler run complete 11683 1726853271.69860: variable 'ansible_facts' from source: unknown 11683 1726853271.69958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853271.70478: variable 'ansible_facts' from source: unknown 11683 1726853271.70511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853271.70733: attempt loop complete, returning result 11683 1726853271.70739: _execute() done 11683 1726853271.70741: dumping result to json 11683 1726853271.70813: done dumping result, returning 11683 1726853271.70823: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which services are running [02083763-bbaf-c5b2-e075-000000000496] 11683 1726853271.70826: sending task result for task 02083763-bbaf-c5b2-e075-000000000496 11683 1726853271.72218: done sending task result for task 02083763-bbaf-c5b2-e075-000000000496 11683 1726853271.72223: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853271.72344: no more pending results, returning what we have 11683 1726853271.72348: results queue empty 11683 1726853271.72349: checking for any_errors_fatal 11683 1726853271.72352: done checking for any_errors_fatal 11683 1726853271.72353: checking for max_fail_percentage 11683 1726853271.72354: done checking for max_fail_percentage 11683 1726853271.72355: checking to see if all hosts have failed and the running result is not ok 11683 1726853271.72356: done checking to see if all hosts have failed 11683 1726853271.72357: getting the remaining hosts for this loop 11683 1726853271.72358: done getting the remaining 
hosts for this loop 11683 1726853271.72361: getting the next task for host managed_node3 11683 1726853271.72367: done getting next task for host managed_node3 11683 1726853271.72398: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11683 1726853271.72404: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853271.72459: getting variables 11683 1726853271.72461: in VariableManager get_vars() 11683 1726853271.72506: Calling all_inventory to load vars for managed_node3 11683 1726853271.72509: Calling groups_inventory to load vars for managed_node3 11683 1726853271.72512: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853271.72521: Calling all_plugins_play to load vars for managed_node3 11683 1726853271.72523: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853271.72526: Calling groups_plugins_play to load vars for managed_node3 11683 1726853271.73479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853271.74702: done with get_vars() 11683 1726853271.74734: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 13:27:51 -0400 (0:00:01.925) 0:00:24.820 ****** 11683 1726853271.74898: entering _queue_task() for managed_node3/package_facts 11683 1726853271.75397: worker is 1 (out of 1 available) 11683 1726853271.75412: exiting _queue_task() for managed_node3/package_facts 11683 1726853271.75426: done queuing things up, now waiting for results queue to drain 11683 1726853271.75428: waiting for pending results... 
11683 1726853271.75699: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 11683 1726853271.75904: in run() - task 02083763-bbaf-c5b2-e075-000000000497 11683 1726853271.75910: variable 'ansible_search_path' from source: unknown 11683 1726853271.75913: variable 'ansible_search_path' from source: unknown 11683 1726853271.75993: calling self._execute() 11683 1726853271.76076: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853271.76088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853271.76104: variable 'omit' from source: magic vars 11683 1726853271.76538: variable 'ansible_distribution_major_version' from source: facts 11683 1726853271.76552: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853271.76560: variable 'omit' from source: magic vars 11683 1726853271.76615: variable 'omit' from source: magic vars 11683 1726853271.76644: variable 'omit' from source: magic vars 11683 1726853271.76680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853271.76707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853271.76724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853271.76737: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853271.76747: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853271.76777: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853271.76780: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853271.76782: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node3' 11683 1726853271.76850: Set connection var ansible_shell_executable to /bin/sh 11683 1726853271.76898: Set connection var ansible_timeout to 10 11683 1726853271.76901: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853271.76903: Set connection var ansible_pipelining to False 11683 1726853271.76906: Set connection var ansible_shell_type to sh 11683 1726853271.76908: Set connection var ansible_connection to ssh 11683 1726853271.76925: variable 'ansible_shell_executable' from source: unknown 11683 1726853271.76928: variable 'ansible_connection' from source: unknown 11683 1726853271.76930: variable 'ansible_module_compression' from source: unknown 11683 1726853271.76933: variable 'ansible_shell_type' from source: unknown 11683 1726853271.76935: variable 'ansible_shell_executable' from source: unknown 11683 1726853271.76937: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853271.76941: variable 'ansible_pipelining' from source: unknown 11683 1726853271.76946: variable 'ansible_timeout' from source: unknown 11683 1726853271.76949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853271.77127: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853271.77138: variable 'omit' from source: magic vars 11683 1726853271.77145: starting attempt loop 11683 1726853271.77148: running the handler 11683 1726853271.77158: _low_level_execute_command(): starting 11683 1726853271.77164: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853271.77978: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853271.78067: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853271.78095: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853271.78192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853271.79895: stdout chunk (state=3): >>>/root <<< 11683 1726853271.80050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853271.80054: stdout chunk (state=3): >>><<< 11683 1726853271.80056: stderr chunk (state=3): >>><<< 11683 1726853271.80186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853271.80190: _low_level_execute_command(): starting 11683 1726853271.80194: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959 `" && echo ansible-tmp-1726853271.8008878-12764-97713183893959="` echo /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959 `" ) && sleep 0' 11683 1726853271.80897: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853271.81001: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853271.81058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853271.81089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853271.81179: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853271.81196: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.81273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853271.81326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853271.81357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853271.81518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853271.83533: stdout chunk (state=3): >>>ansible-tmp-1726853271.8008878-12764-97713183893959=/root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959 <<< 11683 1726853271.83740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853271.83752: stdout chunk (state=3): >>><<< 11683 1726853271.83758: stderr chunk (state=3): >>><<< 11683 1726853271.83985: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853271.8008878-12764-97713183893959=/root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853271.83988: variable 'ansible_module_compression' from source: unknown 11683 1726853271.83990: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11683 1726853271.84078: variable 'ansible_facts' from source: unknown 11683 1726853271.84309: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py 11683 1726853271.84554: Sending initial data 11683 1726853271.84557: Sent initial data (161 bytes) 11683 1726853271.85257: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.85411: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853271.85468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853271.85495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853271.85621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853271.87299: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853271.87360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853271.87440: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpysavfeo1 /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py <<< 11683 1726853271.87450: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py" <<< 11683 1726853271.87492: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpysavfeo1" to remote "/root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py" <<< 11683 1726853271.88761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853271.88801: stderr chunk (state=3): >>><<< 11683 1726853271.88846: stdout chunk (state=3): >>><<< 11683 1726853271.88850: done transferring module to remote 11683 1726853271.88852: _low_level_execute_command(): starting 11683 1726853271.88854: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/ /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py && sleep 0' 11683 1726853271.89274: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853271.89312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853271.89316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.89318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853271.89321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.89365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853271.89368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853271.89435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853271.91310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853271.91333: stderr chunk (state=3): >>><<< 11683 1726853271.91336: stdout chunk (state=3): >>><<< 11683 1726853271.91351: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853271.91354: _low_level_execute_command(): starting 11683 1726853271.91360: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/AnsiballZ_package_facts.py && sleep 0' 11683 1726853271.91795: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853271.91801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.91824: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853271.91875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 
1726853271.91880: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853271.91882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853271.91952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853272.37280: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", 
"version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": 
"4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": 
"libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", 
"version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11683 1726853272.37312: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", 
"version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": 
"0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": 
"2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": 
"1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": 
"rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": 
[{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": 
"perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", 
"epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11683 1726853272.39031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853272.39095: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. <<< 11683 1726853272.39112: stderr chunk (state=3): >>><<< 11683 1726853272.39126: stdout chunk (state=3): >>><<< 11683 1726853272.39161: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": 
"google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": 
"nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": 
"17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": 
"2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": 
"libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": 
"0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", 
"version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": 
"libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": 
"14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": 
"9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": 
"4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", 
"release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": 
[{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": 
[{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": 
"1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", 
"release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": 
"perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, 
"arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", 
"release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", 
"version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", 
"release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853272.41689: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853272.41694: _low_level_execute_command(): starting 11683 1726853272.41696: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853271.8008878-12764-97713183893959/ > /dev/null 2>&1 && sleep 0' 11683 1726853272.42290: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853272.42307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853272.42329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853272.42392: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853272.42454: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853272.42469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853272.42507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853272.42670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853272.44778: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853272.44781: stdout chunk (state=3): >>><<< 11683 1726853272.44783: stderr chunk (state=3): >>><<< 11683 1726853272.44785: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853272.44787: handler run complete 11683 1726853272.45781: variable 'ansible_facts' from source: unknown 11683 1726853272.46278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853272.48416: variable 'ansible_facts' from source: unknown 11683 1726853272.48822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853272.50124: attempt loop complete, returning result 11683 1726853272.50138: _execute() done 11683 1726853272.50141: dumping result to json 11683 1726853272.50555: done dumping result, returning 11683 1726853272.50565: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [02083763-bbaf-c5b2-e075-000000000497] 11683 1726853272.50570: sending task result for task 02083763-bbaf-c5b2-e075-000000000497 11683 1726853272.54186: done sending task result for task 02083763-bbaf-c5b2-e075-000000000497 11683 1726853272.54190: WORKER PROCESS EXITING ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853272.54335: no more pending results, returning what we have 11683 1726853272.54338: results queue empty 11683 1726853272.54339: 
checking for any_errors_fatal 11683 1726853272.54344: done checking for any_errors_fatal 11683 1726853272.54345: checking for max_fail_percentage 11683 1726853272.54346: done checking for max_fail_percentage 11683 1726853272.54347: checking to see if all hosts have failed and the running result is not ok 11683 1726853272.54348: done checking to see if all hosts have failed 11683 1726853272.54349: getting the remaining hosts for this loop 11683 1726853272.54350: done getting the remaining hosts for this loop 11683 1726853272.54353: getting the next task for host managed_node3 11683 1726853272.54361: done getting next task for host managed_node3 11683 1726853272.54364: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11683 1726853272.54369: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853272.54500: getting variables 11683 1726853272.54502: in VariableManager get_vars() 11683 1726853272.54540: Calling all_inventory to load vars for managed_node3 11683 1726853272.54543: Calling groups_inventory to load vars for managed_node3 11683 1726853272.54545: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853272.54554: Calling all_plugins_play to load vars for managed_node3 11683 1726853272.54557: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853272.54560: Calling groups_plugins_play to load vars for managed_node3 11683 1726853272.57294: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853272.60998: done with get_vars() 11683 1726853272.61033: done getting variables 11683 1726853272.61213: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 13:27:52 -0400 (0:00:00.863) 0:00:25.684 ****** 11683 1726853272.61255: entering _queue_task() for managed_node3/debug 11683 1726853272.62110: worker is 1 (out of 1 available) 11683 1726853272.62123: exiting _queue_task() for managed_node3/debug 11683 1726853272.62135: done queuing things up, now waiting for results queue to drain 11683 1726853272.62136: waiting for pending results... 
11683 1726853272.63196: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider 11683 1726853272.63578: in run() - task 02083763-bbaf-c5b2-e075-00000000007d 11683 1726853272.63583: variable 'ansible_search_path' from source: unknown 11683 1726853272.63586: variable 'ansible_search_path' from source: unknown 11683 1726853272.63589: calling self._execute() 11683 1726853272.63592: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853272.63594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853272.63598: variable 'omit' from source: magic vars 11683 1726853272.64743: variable 'ansible_distribution_major_version' from source: facts 11683 1726853272.64879: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853272.64892: variable 'omit' from source: magic vars 11683 1726853272.65079: variable 'omit' from source: magic vars 11683 1726853272.65302: variable 'network_provider' from source: set_fact 11683 1726853272.65328: variable 'omit' from source: magic vars 11683 1726853272.65478: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853272.65525: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853272.65560: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853272.65625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853272.65645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853272.65738: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853272.65748: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 
1726853272.65980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853272.65990: Set connection var ansible_shell_executable to /bin/sh 11683 1726853272.66008: Set connection var ansible_timeout to 10 11683 1726853272.66099: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853272.66110: Set connection var ansible_pipelining to False 11683 1726853272.66118: Set connection var ansible_shell_type to sh 11683 1726853272.66124: Set connection var ansible_connection to ssh 11683 1726853272.66155: variable 'ansible_shell_executable' from source: unknown 11683 1726853272.66187: variable 'ansible_connection' from source: unknown 11683 1726853272.66201: variable 'ansible_module_compression' from source: unknown 11683 1726853272.66289: variable 'ansible_shell_type' from source: unknown 11683 1726853272.66299: variable 'ansible_shell_executable' from source: unknown 11683 1726853272.66309: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853272.66476: variable 'ansible_pipelining' from source: unknown 11683 1726853272.66479: variable 'ansible_timeout' from source: unknown 11683 1726853272.66481: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853272.66907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853272.67179: variable 'omit' from source: magic vars 11683 1726853272.67182: starting attempt loop 11683 1726853272.67185: running the handler 11683 1726853272.67187: handler run complete 11683 1726853272.67190: attempt loop complete, returning result 11683 1726853272.67192: _execute() done 11683 1726853272.67194: dumping result to json 11683 1726853272.67196: done dumping result, returning 
11683 1726853272.67198: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Print network provider [02083763-bbaf-c5b2-e075-00000000007d] 11683 1726853272.67505: sending task result for task 02083763-bbaf-c5b2-e075-00000000007d 11683 1726853272.67735: done sending task result for task 02083763-bbaf-c5b2-e075-00000000007d 11683 1726853272.67740: WORKER PROCESS EXITING ok: [managed_node3] => {} MSG: Using network provider: nm 11683 1726853272.67847: no more pending results, returning what we have 11683 1726853272.67851: results queue empty 11683 1726853272.67852: checking for any_errors_fatal 11683 1726853272.67860: done checking for any_errors_fatal 11683 1726853272.67861: checking for max_fail_percentage 11683 1726853272.67862: done checking for max_fail_percentage 11683 1726853272.67863: checking to see if all hosts have failed and the running result is not ok 11683 1726853272.67864: done checking to see if all hosts have failed 11683 1726853272.67865: getting the remaining hosts for this loop 11683 1726853272.67866: done getting the remaining hosts for this loop 11683 1726853272.67869: getting the next task for host managed_node3 11683 1726853272.67879: done getting next task for host managed_node3 11683 1726853272.67883: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11683 1726853272.67887: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853272.67899: getting variables 11683 1726853272.67901: in VariableManager get_vars() 11683 1726853272.67939: Calling all_inventory to load vars for managed_node3 11683 1726853272.67941: Calling groups_inventory to load vars for managed_node3 11683 1726853272.67946: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853272.67956: Calling all_plugins_play to load vars for managed_node3 11683 1726853272.67959: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853272.67961: Calling groups_plugins_play to load vars for managed_node3 11683 1726853272.87855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853272.91259: done with get_vars() 11683 1726853272.91411: done getting variables 11683 1726853272.91462: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 13:27:52 -0400 (0:00:00.303) 0:00:25.987 ****** 11683 1726853272.91599: entering _queue_task() for managed_node3/fail 11683 
1726853272.92664: worker is 1 (out of 1 available) 11683 1726853272.92679: exiting _queue_task() for managed_node3/fail 11683 1726853272.92691: done queuing things up, now waiting for results queue to drain 11683 1726853272.92693: waiting for pending results... 11683 1726853272.93569: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11683 1726853272.93993: in run() - task 02083763-bbaf-c5b2-e075-00000000007e 11683 1726853272.93998: variable 'ansible_search_path' from source: unknown 11683 1726853272.94002: variable 'ansible_search_path' from source: unknown 11683 1726853272.94225: calling self._execute() 11683 1726853272.94538: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853272.94543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853272.94628: variable 'omit' from source: magic vars 11683 1726853272.95778: variable 'ansible_distribution_major_version' from source: facts 11683 1726853272.95783: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853272.96037: variable 'network_state' from source: role '' defaults 11683 1726853272.96157: Evaluated conditional (network_state != {}): False 11683 1726853272.96164: when evaluation is False, skipping this task 11683 1726853272.96167: _execute() done 11683 1726853272.96170: dumping result to json 11683 1726853272.96234: done dumping result, returning 11683 1726853272.96238: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [02083763-bbaf-c5b2-e075-00000000007e] 11683 1726853272.96240: sending task result for task 02083763-bbaf-c5b2-e075-00000000007e 11683 1726853272.96312: done sending task result for task 
02083763-bbaf-c5b2-e075-00000000007e 11683 1726853272.96315: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853272.96421: no more pending results, returning what we have 11683 1726853272.96426: results queue empty 11683 1726853272.96427: checking for any_errors_fatal 11683 1726853272.96435: done checking for any_errors_fatal 11683 1726853272.96436: checking for max_fail_percentage 11683 1726853272.96438: done checking for max_fail_percentage 11683 1726853272.96439: checking to see if all hosts have failed and the running result is not ok 11683 1726853272.96440: done checking to see if all hosts have failed 11683 1726853272.96442: getting the remaining hosts for this loop 11683 1726853272.96446: done getting the remaining hosts for this loop 11683 1726853272.96450: getting the next task for host managed_node3 11683 1726853272.96458: done getting next task for host managed_node3 11683 1726853272.96462: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11683 1726853272.96467: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11683 1726853272.96492: getting variables 11683 1726853272.96494: in VariableManager get_vars() 11683 1726853272.96537: Calling all_inventory to load vars for managed_node3 11683 1726853272.96539: Calling groups_inventory to load vars for managed_node3 11683 1726853272.96542: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853272.96557: Calling all_plugins_play to load vars for managed_node3 11683 1726853272.96559: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853272.96562: Calling groups_plugins_play to load vars for managed_node3 11683 1726853272.99632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.04324: done with get_vars() 11683 1726853273.04358: done getting variables 11683 1726853273.04574: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 13:27:53 -0400 (0:00:00.130) 0:00:26.118 ****** 11683 1726853273.04612: entering _queue_task() for managed_node3/fail 11683 1726853273.05025: worker is 1 (out of 1 available) 11683 1726853273.05039: exiting _queue_task() for managed_node3/fail 11683 1726853273.05053: done queuing things up, now waiting for results queue to drain 11683 1726853273.05054: waiting for pending results... 
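The two `Abort applying the network state configuration ...` tasks above are both skipped for the same reason: their `when:` condition `network_state != {}` evaluates False against the role default of `{}`, producing the `skipping: [managed_node3]` result JSON seen in the log. A minimal sketch of that skip decision, with the caveat that real conditionals are templated through Jinja2 rather than evaluated like this:

```python
# Rough emulation of the skip decision shown in the log: when the task's
# `when:` condition evaluates False, a skipped result is recorded instead
# of running the task body. The result fields mirror the log output;
# eval() is a simplified stand-in for Jinja2 conditional templating.

def run_or_skip(condition_str, variables):
    """Return a skipped-task result dict when the condition is false."""
    result = eval(condition_str, {}, variables)  # simplified; not how Ansible does it
    if not result:
        return {
            "changed": False,
            "false_condition": condition_str,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": False}  # task body would run here

# network_state defaults to {} in the role, so both abort tasks are skipped.
print(run_or_skip("network_state != {}", {"network_state": {}}))
```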
11683 1726853273.05500: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11683 1726853273.05506: in run() - task 02083763-bbaf-c5b2-e075-00000000007f 11683 1726853273.05546: variable 'ansible_search_path' from source: unknown 11683 1726853273.05550: variable 'ansible_search_path' from source: unknown 11683 1726853273.05653: calling self._execute() 11683 1726853273.05665: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.05687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.05695: variable 'omit' from source: magic vars 11683 1726853273.06094: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.06111: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.06233: variable 'network_state' from source: role '' defaults 11683 1726853273.06246: Evaluated conditional (network_state != {}): False 11683 1726853273.06249: when evaluation is False, skipping this task 11683 1726853273.06252: _execute() done 11683 1726853273.06255: dumping result to json 11683 1726853273.06377: done dumping result, returning 11683 1726853273.06382: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [02083763-bbaf-c5b2-e075-00000000007f] 11683 1726853273.06385: sending task result for task 02083763-bbaf-c5b2-e075-00000000007f 11683 1726853273.06589: done sending task result for task 02083763-bbaf-c5b2-e075-00000000007f 11683 1726853273.06593: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853273.06655: no more pending results, returning what we have 11683 
1726853273.06659: results queue empty 11683 1726853273.06660: checking for any_errors_fatal 11683 1726853273.06668: done checking for any_errors_fatal 11683 1726853273.06669: checking for max_fail_percentage 11683 1726853273.06673: done checking for max_fail_percentage 11683 1726853273.06674: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.06675: done checking to see if all hosts have failed 11683 1726853273.06676: getting the remaining hosts for this loop 11683 1726853273.06678: done getting the remaining hosts for this loop 11683 1726853273.06681: getting the next task for host managed_node3 11683 1726853273.06690: done getting next task for host managed_node3 11683 1726853273.06694: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11683 1726853273.06700: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853273.06723: getting variables 11683 1726853273.06726: in VariableManager get_vars() 11683 1726853273.06991: Calling all_inventory to load vars for managed_node3 11683 1726853273.06999: Calling groups_inventory to load vars for managed_node3 11683 1726853273.07006: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.07018: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.07022: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.07026: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.10647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.11814: done with get_vars() 11683 1726853273.11846: done getting variables 11683 1726853273.11922: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 13:27:53 -0400 (0:00:00.073) 0:00:26.191 ****** 11683 1726853273.11962: entering _queue_task() for managed_node3/fail 11683 1726853273.12368: worker is 1 (out of 1 available) 11683 1726853273.12384: exiting _queue_task() for managed_node3/fail 11683 1726853273.12510: done queuing things up, now waiting for results queue to drain 11683 1726853273.12512: waiting for pending results... 
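The EL10 task just queued is gated (as the log shows a moment later) by `ansible_distribution_major_version | int > 9`. The `| int` cast is what makes that comparison correct: distribution facts arrive as strings, and lexicographic string comparison would order `'10'` before `'9'`. A short illustration of the trap the filter avoids:

```python
# Why the `| int` filter matters in (ansible_distribution_major_version | int > 9):
# facts are strings, and string comparison is lexicographic, so '10' sorts
# before '9' and the EL10-or-later check would silently fail without the cast.

major = "10"  # ansible_distribution_major_version is a string fact

print(major > "9")     # False -- compares the first characters, '1' vs '9'
print(int(major) > 9)  # True  -- numeric comparison, as the `| int` filter intends
```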
11683 1726853273.12710: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11683 1726853273.12818: in run() - task 02083763-bbaf-c5b2-e075-000000000080 11683 1726853273.12868: variable 'ansible_search_path' from source: unknown 11683 1726853273.12873: variable 'ansible_search_path' from source: unknown 11683 1726853273.12877: calling self._execute() 11683 1726853273.12963: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.12974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.12981: variable 'omit' from source: magic vars 11683 1726853273.13387: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.13391: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.13549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.16322: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.16342: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.16420: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.16445: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.16473: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.16534: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.16557: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.16579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.16611: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.16670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.16735: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.16742: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11683 1726853273.16867: variable 'ansible_distribution' from source: facts 11683 1726853273.16872: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.16883: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11683 1726853273.17047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.17062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.17080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 
1726853273.17118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.17125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.17164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.17181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.17197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.17221: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.17232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.17277: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.17296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11683 1726853273.17313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.17342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.17378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.17594: variable 'network_connections' from source: task vars 11683 1726853273.17604: variable 'port2_profile' from source: play vars 11683 1726853273.17664: variable 'port2_profile' from source: play vars 11683 1726853273.17674: variable 'port1_profile' from source: play vars 11683 1726853273.17722: variable 'port1_profile' from source: play vars 11683 1726853273.17725: variable 'controller_profile' from source: play vars 11683 1726853273.17765: variable 'controller_profile' from source: play vars 11683 1726853273.17774: variable 'network_state' from source: role '' defaults 11683 1726853273.17841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853273.18014: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853273.18032: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853273.18058: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853273.18082: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853273.18151: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853273.18159: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853273.18181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.18199: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853273.18239: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11683 1726853273.18244: when evaluation is False, skipping this task 11683 1726853273.18247: _execute() done 11683 1726853273.18255: dumping result to json 11683 1726853273.18258: done dumping result, returning 11683 1726853273.18261: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [02083763-bbaf-c5b2-e075-000000000080] 11683 1726853273.18263: sending task result for task 02083763-bbaf-c5b2-e075-000000000080 11683 1726853273.18353: done sending task result for task 02083763-bbaf-c5b2-e075-000000000080 11683 1726853273.18356: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 
or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11683 1726853273.18445: no more pending results, returning what we have 11683 1726853273.18449: results queue empty 11683 1726853273.18450: checking for any_errors_fatal 11683 1726853273.18456: done checking for any_errors_fatal 11683 1726853273.18457: checking for max_fail_percentage 11683 1726853273.18459: done checking for max_fail_percentage 11683 1726853273.18459: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.18461: done checking to see if all hosts have failed 11683 1726853273.18461: getting the remaining hosts for this loop 11683 1726853273.18463: done getting the remaining hosts for this loop 11683 1726853273.18467: getting the next task for host managed_node3 11683 1726853273.18475: done getting next task for host managed_node3 11683 1726853273.18480: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11683 1726853273.18483: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11683 1726853273.18501: getting variables 11683 1726853273.18503: in VariableManager get_vars() 11683 1726853273.18539: Calling all_inventory to load vars for managed_node3 11683 1726853273.18541: Calling groups_inventory to load vars for managed_node3 11683 1726853273.18545: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.18554: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.18556: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.18558: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.19840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.21366: done with get_vars() 11683 1726853273.21384: done getting variables 11683 1726853273.21430: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 13:27:53 -0400 (0:00:00.094) 0:00:26.286 ****** 11683 1726853273.21455: entering _queue_task() for managed_node3/dnf 11683 1726853273.21722: worker is 1 (out of 1 available) 11683 1726853273.21735: exiting _queue_task() for managed_node3/dnf 11683 1726853273.21747: done queuing things up, now waiting for results queue to drain 11683 1726853273.21748: waiting for pending results... 
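The teaming-abort condition evaluated above chains two `selectattr` filters: `selectattr("type", "defined")` keeps entries that have a `type` key, and `selectattr("type", "match", "^team$")` keeps those whose type matches the anchored regex. A plain-Python reading of that check; the profile data below is invented (the log only shows that the real bond/port profiles contain no team interfaces):

```python
import re

# Plain-Python equivalent of the Jinja2 condition from the log:
#   network_connections | selectattr("type", "defined")
#                       | selectattr("type", "match", "^team$") | list | length > 0
#   or network_state.get("interfaces", []) | ... (same chain)

def has_team_interfaces(network_connections, network_state):
    def team_entries(entries):
        defined = [e for e in entries if "type" in e]                # selectattr(..., "defined")
        return [e for e in defined if re.match("^team$", e["type"])]  # selectattr(..., "match", ...)
    return (len(team_entries(network_connections)) > 0
            or len(team_entries(network_state.get("interfaces", []))) > 0)

connections = [
    {"name": "bond0.0", "type": "ethernet"},  # hypothetical port profile
    {"name": "bond0", "type": "bond"},        # hypothetical controller profile
]
print(has_team_interfaces(connections, {}))  # False, so the abort task is skipped
```

With no `type: team` entry in either `network_connections` or `network_state`, the whole condition is False, matching the `Conditional result was False` skip recorded in the log.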
11683 1726853273.21949: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11683 1726853273.22078: in run() - task 02083763-bbaf-c5b2-e075-000000000081 11683 1726853273.22090: variable 'ansible_search_path' from source: unknown 11683 1726853273.22094: variable 'ansible_search_path' from source: unknown 11683 1726853273.22125: calling self._execute() 11683 1726853273.22203: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.22209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.22226: variable 'omit' from source: magic vars 11683 1726853273.22690: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.22694: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.22896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.24879: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.24944: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.24998: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.25026: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.25045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.25151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.25155: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.25200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.25214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.25227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.25333: variable 'ansible_distribution' from source: facts 11683 1726853273.25339: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.25352: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11683 1726853273.25432: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.25533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.25563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.25585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.25610: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.25624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.25660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.25677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.25711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.25736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.25755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.25784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.25804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 
1726853273.25823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.25848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.25859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.25961: variable 'network_connections' from source: task vars 11683 1726853273.25975: variable 'port2_profile' from source: play vars 11683 1726853273.26034: variable 'port2_profile' from source: play vars 11683 1726853273.26037: variable 'port1_profile' from source: play vars 11683 1726853273.26080: variable 'port1_profile' from source: play vars 11683 1726853273.26088: variable 'controller_profile' from source: play vars 11683 1726853273.26132: variable 'controller_profile' from source: play vars 11683 1726853273.26179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853273.26306: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853273.26364: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853273.26374: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853273.26402: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853273.26438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11683 1726853273.26476: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853273.26503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.26515: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853273.26563: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853273.26725: variable 'network_connections' from source: task vars 11683 1726853273.26729: variable 'port2_profile' from source: play vars 11683 1726853273.26774: variable 'port2_profile' from source: play vars 11683 1726853273.26781: variable 'port1_profile' from source: play vars 11683 1726853273.26823: variable 'port1_profile' from source: play vars 11683 1726853273.26829: variable 'controller_profile' from source: play vars 11683 1726853273.26897: variable 'controller_profile' from source: play vars 11683 1726853273.26913: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11683 1726853273.26918: when evaluation is False, skipping this task 11683 1726853273.26921: _execute() done 11683 1726853273.26923: dumping result to json 11683 1726853273.26925: done dumping result, returning 11683 1726853273.26934: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-000000000081] 11683 1726853273.26937: sending task result for task 
02083763-bbaf-c5b2-e075-000000000081 11683 1726853273.27036: done sending task result for task 02083763-bbaf-c5b2-e075-000000000081 11683 1726853273.27039: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11683 1726853273.27094: no more pending results, returning what we have 11683 1726853273.27097: results queue empty 11683 1726853273.27098: checking for any_errors_fatal 11683 1726853273.27106: done checking for any_errors_fatal 11683 1726853273.27107: checking for max_fail_percentage 11683 1726853273.27109: done checking for max_fail_percentage 11683 1726853273.27109: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.27110: done checking to see if all hosts have failed 11683 1726853273.27111: getting the remaining hosts for this loop 11683 1726853273.27112: done getting the remaining hosts for this loop 11683 1726853273.27115: getting the next task for host managed_node3 11683 1726853273.27129: done getting next task for host managed_node3 11683 1726853273.27135: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11683 1726853273.27139: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853273.27160: getting variables 11683 1726853273.27161: in VariableManager get_vars() 11683 1726853273.27200: Calling all_inventory to load vars for managed_node3 11683 1726853273.27202: Calling groups_inventory to load vars for managed_node3 11683 1726853273.27204: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.27213: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.27215: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.27217: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.28131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.29017: done with get_vars() 11683 1726853273.29033: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11683 1726853273.29090: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 13:27:53 -0400 (0:00:00.076) 0:00:26.363 ****** 11683 1726853273.29114: entering _queue_task() for managed_node3/yum 11683 1726853273.29391: worker is 1 (out of 1 available) 11683 
1726853273.29407: exiting _queue_task() for managed_node3/yum 11683 1726853273.29418: done queuing things up, now waiting for results queue to drain 11683 1726853273.29420: waiting for pending results... 11683 1726853273.29608: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11683 1726853273.29714: in run() - task 02083763-bbaf-c5b2-e075-000000000082 11683 1726853273.29725: variable 'ansible_search_path' from source: unknown 11683 1726853273.29728: variable 'ansible_search_path' from source: unknown 11683 1726853273.29763: calling self._execute() 11683 1726853273.29838: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.29842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.29853: variable 'omit' from source: magic vars 11683 1726853273.30132: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.30142: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.30321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.32536: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.32687: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.32705: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.32734: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.32756: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.32816: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.32839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.32858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.32885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.32895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.32966: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.32981: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11683 1726853273.32984: when evaluation is False, skipping this task 11683 1726853273.32986: _execute() done 11683 1726853273.32989: dumping result to json 11683 1726853273.32993: done dumping result, returning 11683 1726853273.33000: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-000000000082] 11683 1726853273.33003: sending task result for task 02083763-bbaf-c5b2-e075-000000000082 11683 1726853273.33098: done sending task result for task 02083763-bbaf-c5b2-e075-000000000082 11683 1726853273.33101: WORKER PROCESS EXITING skipping: 
[managed_node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11683 1726853273.33190: no more pending results, returning what we have 11683 1726853273.33194: results queue empty 11683 1726853273.33195: checking for any_errors_fatal 11683 1726853273.33201: done checking for any_errors_fatal 11683 1726853273.33202: checking for max_fail_percentage 11683 1726853273.33204: done checking for max_fail_percentage 11683 1726853273.33204: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.33205: done checking to see if all hosts have failed 11683 1726853273.33206: getting the remaining hosts for this loop 11683 1726853273.33207: done getting the remaining hosts for this loop 11683 1726853273.33212: getting the next task for host managed_node3 11683 1726853273.33218: done getting next task for host managed_node3 11683 1726853273.33221: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11683 1726853273.33225: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853273.33241: getting variables 11683 1726853273.33245: in VariableManager get_vars() 11683 1726853273.33282: Calling all_inventory to load vars for managed_node3 11683 1726853273.33285: Calling groups_inventory to load vars for managed_node3 11683 1726853273.33287: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.33296: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.33298: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.33301: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.34868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.36087: done with get_vars() 11683 1726853273.36104: done getting variables 11683 1726853273.36149: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 13:27:53 -0400 (0:00:00.070) 0:00:26.433 ****** 11683 1726853273.36178: entering _queue_task() for managed_node3/fail 11683 1726853273.36433: worker is 1 (out of 1 available) 11683 1726853273.36448: exiting _queue_task() for managed_node3/fail 11683 1726853273.36461: done queuing things up, now waiting for results queue to drain 11683 1726853273.36462: waiting for pending results... 
11683 1726853273.36647: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11683 1726853273.36747: in run() - task 02083763-bbaf-c5b2-e075-000000000083 11683 1726853273.36756: variable 'ansible_search_path' from source: unknown 11683 1726853273.36760: variable 'ansible_search_path' from source: unknown 11683 1726853273.36819: calling self._execute() 11683 1726853273.36878: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.36882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.36891: variable 'omit' from source: magic vars 11683 1726853273.37175: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.37186: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.37286: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.37408: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.39387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.39414: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.39451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.39486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.39512: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.39589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11683 1726853273.39719: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.39723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.39725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.39728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.39734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.39759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.39785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.39828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.39836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.39877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.39900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.39930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.39958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.39973: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.40140: variable 'network_connections' from source: task vars 11683 1726853273.40157: variable 'port2_profile' from source: play vars 11683 1726853273.40266: variable 'port2_profile' from source: play vars 11683 1726853273.40269: variable 'port1_profile' from source: play vars 11683 1726853273.40277: variable 'port1_profile' from source: play vars 11683 1726853273.40286: variable 'controller_profile' from source: play vars 11683 1726853273.40336: variable 'controller_profile' from source: play vars 11683 1726853273.40502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853273.40723: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853273.40768: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853273.40875: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853273.40878: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853273.40900: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853273.40945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853273.40980: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.41011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853273.41085: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853273.41377: variable 'network_connections' from source: task vars 11683 1726853273.41388: variable 'port2_profile' from source: play vars 11683 1726853273.41454: variable 'port2_profile' from source: play vars 11683 1726853273.41827: variable 'port1_profile' from source: play vars 11683 1726853273.41830: variable 'port1_profile' from source: play vars 11683 1726853273.41847: variable 'controller_profile' from source: play vars 11683 1726853273.41941: variable 'controller_profile' from source: play vars 11683 1726853273.41995: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11683 1726853273.42017: when evaluation is False, skipping this task 11683 1726853273.42048: _execute() done 11683 1726853273.42056: dumping result to json 11683 1726853273.42059: done dumping result, returning 11683 1726853273.42077: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-000000000083] 11683 1726853273.42128: sending task result for task 02083763-bbaf-c5b2-e075-000000000083 11683 1726853273.42217: done sending task result for task 02083763-bbaf-c5b2-e075-000000000083 11683 1726853273.42221: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11683 1726853273.42375: no more pending results, returning what we have 11683 1726853273.42378: results queue empty 11683 1726853273.42385: checking for any_errors_fatal 11683 1726853273.42392: done checking for any_errors_fatal 11683 1726853273.42393: checking for max_fail_percentage 11683 1726853273.42395: done checking for max_fail_percentage 11683 1726853273.42395: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.42397: done checking to see if all hosts have failed 11683 1726853273.42397: getting the remaining hosts for this loop 11683 1726853273.42400: done getting the remaining hosts for this loop 11683 1726853273.42403: getting the next task for host managed_node3 11683 1726853273.42411: done getting next task for host managed_node3 11683 1726853273.42415: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11683 1726853273.42420: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853273.42438: getting variables 11683 1726853273.42440: in VariableManager get_vars() 11683 1726853273.42692: Calling all_inventory to load vars for managed_node3 11683 1726853273.42695: Calling groups_inventory to load vars for managed_node3 11683 1726853273.42698: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.42708: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.42710: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.42713: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.44353: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.45261: done with get_vars() 11683 1726853273.45283: done getting variables 11683 1726853273.45334: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 13:27:53 -0400 (0:00:00.092) 0:00:26.525 ****** 11683 1726853273.45390: entering _queue_task() for managed_node3/package 11683 1726853273.45888: worker is 1 (out of 1 available) 11683 1726853273.45900: exiting _queue_task() for managed_node3/package 11683 1726853273.45911: done queuing things up, now waiting for results queue to drain 11683 1726853273.45912: waiting for pending results... 11683 1726853273.46291: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages 11683 1726853273.46351: in run() - task 02083763-bbaf-c5b2-e075-000000000084 11683 1726853273.46378: variable 'ansible_search_path' from source: unknown 11683 1726853273.46392: variable 'ansible_search_path' from source: unknown 11683 1726853273.46433: calling self._execute() 11683 1726853273.46544: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.46557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.46573: variable 'omit' from source: magic vars 11683 1726853273.47038: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.47042: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.47240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853273.47545: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853273.47677: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853273.47681: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853273.47725: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 
1726853273.47847: variable 'network_packages' from source: role '' defaults 11683 1726853273.47962: variable '__network_provider_setup' from source: role '' defaults 11683 1726853273.47981: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853273.48053: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853273.48066: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853273.48134: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853273.48326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.50762: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.50949: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.50953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.50955: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.50992: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.51084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.51114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.51139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11683 1726853273.51189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.51205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.51250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.51274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.51309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.51348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.51359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.51628: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11683 1726853273.51785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.51920: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.51985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.52031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.52036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.52252: variable 'ansible_python' from source: facts 11683 1726853273.52409: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11683 1726853273.52662: variable '__network_wpa_supplicant_required' from source: role '' defaults 11683 1726853273.52897: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11683 1726853273.53230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.53233: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.53247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.53396: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.53409: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.53519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.53664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.53688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.53728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.53741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.54128: variable 'network_connections' from source: task vars 11683 1726853273.54134: variable 'port2_profile' from source: play vars 11683 1726853273.54357: variable 'port2_profile' from source: play vars 11683 1726853273.54368: variable 'port1_profile' from source: play vars 11683 1726853273.54637: variable 'port1_profile' from source: play vars 11683 1726853273.54640: variable 'controller_profile' from source: play vars 11683 1726853273.54749: 
variable 'controller_profile' from source: play vars 11683 1726853273.54889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853273.54915: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853273.55067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.55170: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853273.55294: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.55840: variable 'network_connections' from source: task vars 11683 1726853273.55853: variable 'port2_profile' from source: play vars 11683 1726853273.56087: variable 'port2_profile' from source: play vars 11683 1726853273.56097: variable 'port1_profile' from source: play vars 11683 1726853273.56307: variable 'port1_profile' from source: play vars 11683 1726853273.56316: variable 'controller_profile' from source: play vars 11683 1726853273.56445: variable 'controller_profile' from source: play vars 11683 1726853273.56593: variable '__network_packages_default_wireless' from source: role '' defaults 11683 1726853273.56674: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.57429: variable 'network_connections' from source: task vars 11683 1726853273.57433: variable 'port2_profile' from source: play vars 11683 1726853273.57549: variable 'port2_profile' from source: play vars 11683 1726853273.57554: variable 
'port1_profile' from source: play vars 11683 1726853273.57795: variable 'port1_profile' from source: play vars 11683 1726853273.57799: variable 'controller_profile' from source: play vars 11683 1726853273.57810: variable 'controller_profile' from source: play vars 11683 1726853273.57953: variable '__network_packages_default_team' from source: role '' defaults 11683 1726853273.58130: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853273.58891: variable 'network_connections' from source: task vars 11683 1726853273.58896: variable 'port2_profile' from source: play vars 11683 1726853273.58964: variable 'port2_profile' from source: play vars 11683 1726853273.59088: variable 'port1_profile' from source: play vars 11683 1726853273.59376: variable 'port1_profile' from source: play vars 11683 1726853273.59379: variable 'controller_profile' from source: play vars 11683 1726853273.59381: variable 'controller_profile' from source: play vars 11683 1726853273.59429: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853273.59548: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853273.59552: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853273.59731: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853273.60256: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11683 1726853273.61307: variable 'network_connections' from source: task vars 11683 1726853273.61311: variable 'port2_profile' from source: play vars 11683 1726853273.61484: variable 'port2_profile' from source: play vars 11683 1726853273.61491: variable 'port1_profile' from source: play vars 11683 1726853273.61558: variable 'port1_profile' from source: play vars 11683 1726853273.61565: variable 'controller_profile' from source: play vars 11683 1726853273.61747: variable 
'controller_profile' from source: play vars 11683 1726853273.61752: variable 'ansible_distribution' from source: facts 11683 1726853273.61755: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.61763: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.61851: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11683 1726853273.62225: variable 'ansible_distribution' from source: facts 11683 1726853273.62228: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.62234: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.62249: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11683 1726853273.62623: variable 'ansible_distribution' from source: facts 11683 1726853273.62626: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.62632: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.62673: variable 'network_provider' from source: set_fact 11683 1726853273.62732: variable 'ansible_facts' from source: unknown 11683 1726853273.64224: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11683 1726853273.64229: when evaluation is False, skipping this task 11683 1726853273.64386: _execute() done 11683 1726853273.64390: dumping result to json 11683 1726853273.64393: done dumping result, returning 11683 1726853273.64395: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install packages [02083763-bbaf-c5b2-e075-000000000084] 11683 1726853273.64397: sending task result for task 02083763-bbaf-c5b2-e075-000000000084 11683 1726853273.64648: done sending task result for task 02083763-bbaf-c5b2-e075-000000000084 11683 1726853273.64652: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "not network_packages is 
subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11683 1726853273.64708: no more pending results, returning what we have 11683 1726853273.64713: results queue empty 11683 1726853273.64714: checking for any_errors_fatal 11683 1726853273.64720: done checking for any_errors_fatal 11683 1726853273.64720: checking for max_fail_percentage 11683 1726853273.64722: done checking for max_fail_percentage 11683 1726853273.64723: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.64724: done checking to see if all hosts have failed 11683 1726853273.64725: getting the remaining hosts for this loop 11683 1726853273.64727: done getting the remaining hosts for this loop 11683 1726853273.64736: getting the next task for host managed_node3 11683 1726853273.64746: done getting next task for host managed_node3 11683 1726853273.64750: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11683 1726853273.64754: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853273.64776: getting variables 11683 1726853273.64778: in VariableManager get_vars() 11683 1726853273.64819: Calling all_inventory to load vars for managed_node3 11683 1726853273.64821: Calling groups_inventory to load vars for managed_node3 11683 1726853273.64824: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.64835: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.64837: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.64839: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.66756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.68762: done with get_vars() 11683 1726853273.68796: done getting variables 11683 1726853273.68917: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 13:27:53 -0400 (0:00:00.235) 0:00:26.761 ****** 11683 1726853273.68965: entering _queue_task() for managed_node3/package 11683 1726853273.69437: worker is 1 (out of 1 available) 11683 1726853273.69454: exiting _queue_task() for managed_node3/package 11683 1726853273.69467: done queuing things up, now waiting for results queue to drain 11683 1726853273.69468: waiting for pending results... 
11683 1726853273.69758: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11683 1726853273.69947: in run() - task 02083763-bbaf-c5b2-e075-000000000085 11683 1726853273.69952: variable 'ansible_search_path' from source: unknown 11683 1726853273.69955: variable 'ansible_search_path' from source: unknown 11683 1726853273.69991: calling self._execute() 11683 1726853273.70163: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.70167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.70169: variable 'omit' from source: magic vars 11683 1726853273.70534: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.70556: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.70693: variable 'network_state' from source: role '' defaults 11683 1726853273.70716: Evaluated conditional (network_state != {}): False 11683 1726853273.70725: when evaluation is False, skipping this task 11683 1726853273.70733: _execute() done 11683 1726853273.70742: dumping result to json 11683 1726853273.70814: done dumping result, returning 11683 1726853273.70818: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [02083763-bbaf-c5b2-e075-000000000085] 11683 1726853273.70821: sending task result for task 02083763-bbaf-c5b2-e075-000000000085 11683 1726853273.70895: done sending task result for task 02083763-bbaf-c5b2-e075-000000000085 11683 1726853273.70898: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853273.70975: no more pending results, returning what we have 11683 1726853273.70979: results queue empty 11683 1726853273.70980: checking 
for any_errors_fatal 11683 1726853273.70988: done checking for any_errors_fatal 11683 1726853273.70989: checking for max_fail_percentage 11683 1726853273.70991: done checking for max_fail_percentage 11683 1726853273.70992: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.70993: done checking to see if all hosts have failed 11683 1726853273.70994: getting the remaining hosts for this loop 11683 1726853273.70996: done getting the remaining hosts for this loop 11683 1726853273.70999: getting the next task for host managed_node3 11683 1726853273.71007: done getting next task for host managed_node3 11683 1726853273.71011: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11683 1726853273.71015: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853273.71046: getting variables 11683 1726853273.71048: in VariableManager get_vars() 11683 1726853273.71095: Calling all_inventory to load vars for managed_node3 11683 1726853273.71098: Calling groups_inventory to load vars for managed_node3 11683 1726853273.71101: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.71114: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.71117: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.71120: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.74017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.75356: done with get_vars() 11683 1726853273.75383: done getting variables 11683 1726853273.75430: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 13:27:53 -0400 (0:00:00.064) 0:00:26.826 ****** 11683 1726853273.75461: entering _queue_task() for managed_node3/package 11683 1726853273.75732: worker is 1 (out of 1 available) 11683 1726853273.75748: exiting _queue_task() for managed_node3/package 11683 1726853273.75759: done queuing things up, now waiting for results queue to drain 11683 1726853273.75761: waiting for pending results... 
11683 1726853273.75980: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11683 1726853273.76112: in run() - task 02083763-bbaf-c5b2-e075-000000000086 11683 1726853273.76118: variable 'ansible_search_path' from source: unknown 11683 1726853273.76195: variable 'ansible_search_path' from source: unknown 11683 1726853273.76199: calling self._execute() 11683 1726853273.76249: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.76253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.76263: variable 'omit' from source: magic vars 11683 1726853273.76682: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.76686: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.76738: variable 'network_state' from source: role '' defaults 11683 1726853273.76749: Evaluated conditional (network_state != {}): False 11683 1726853273.76752: when evaluation is False, skipping this task 11683 1726853273.76756: _execute() done 11683 1726853273.76759: dumping result to json 11683 1726853273.76761: done dumping result, returning 11683 1726853273.76775: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [02083763-bbaf-c5b2-e075-000000000086] 11683 1726853273.76778: sending task result for task 02083763-bbaf-c5b2-e075-000000000086 11683 1726853273.76874: done sending task result for task 02083763-bbaf-c5b2-e075-000000000086 11683 1726853273.76877: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11683 1726853273.76960: no more pending results, returning what we have 11683 1726853273.76964: results queue empty 11683 1726853273.76965: checking for 
any_errors_fatal 11683 1726853273.77009: done checking for any_errors_fatal 11683 1726853273.77011: checking for max_fail_percentage 11683 1726853273.77018: done checking for max_fail_percentage 11683 1726853273.77020: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.77021: done checking to see if all hosts have failed 11683 1726853273.77022: getting the remaining hosts for this loop 11683 1726853273.77024: done getting the remaining hosts for this loop 11683 1726853273.77027: getting the next task for host managed_node3 11683 1726853273.77034: done getting next task for host managed_node3 11683 1726853273.77038: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11683 1726853273.77041: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853273.77059: getting variables 11683 1726853273.77060: in VariableManager get_vars() 11683 1726853273.77099: Calling all_inventory to load vars for managed_node3 11683 1726853273.77107: Calling groups_inventory to load vars for managed_node3 11683 1726853273.77176: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.77188: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.77191: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.77199: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.80169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.82236: done with get_vars() 11683 1726853273.82279: done getting variables 11683 1726853273.82342: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 13:27:53 -0400 (0:00:00.069) 0:00:26.895 ****** 11683 1726853273.82391: entering _queue_task() for managed_node3/service 11683 1726853273.82980: worker is 1 (out of 1 available) 11683 1726853273.82992: exiting _queue_task() for managed_node3/service 11683 1726853273.83002: done queuing things up, now waiting for results queue to drain 11683 1726853273.83003: waiting for pending results... 
11683 1726853273.83955: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11683 1726853273.84034: in run() - task 02083763-bbaf-c5b2-e075-000000000087 11683 1726853273.84269: variable 'ansible_search_path' from source: unknown 11683 1726853273.84275: variable 'ansible_search_path' from source: unknown 11683 1726853273.84278: calling self._execute() 11683 1726853273.84494: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.84508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.84526: variable 'omit' from source: magic vars 11683 1726853273.85116: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.85141: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.85244: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.85387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.86954: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.87015: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.87044: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.87076: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.87095: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.87157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11683 1726853273.87181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.87201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.87228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.87239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.87276: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.87294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.87313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.87339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.87353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.87383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.87400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.87420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.87447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.87458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.87575: variable 'network_connections' from source: task vars 11683 1726853273.87586: variable 'port2_profile' from source: play vars 11683 1726853273.87639: variable 'port2_profile' from source: play vars 11683 1726853273.87650: variable 'port1_profile' from source: play vars 11683 1726853273.87696: variable 'port1_profile' from source: play vars 11683 1726853273.87703: variable 'controller_profile' from source: play vars 11683 1726853273.87749: variable 'controller_profile' from source: play vars 11683 1726853273.87804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853273.87930: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853273.87961: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853273.88003: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853273.88024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853273.88062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853273.88077: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853273.88097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.88116: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853273.88157: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853273.88317: variable 'network_connections' from source: task vars 11683 1726853273.88320: variable 'port2_profile' from source: play vars 11683 1726853273.88387: variable 'port2_profile' from source: play vars 11683 1726853273.88393: variable 'port1_profile' from source: play vars 11683 1726853273.88463: variable 'port1_profile' from source: play vars 11683 1726853273.88503: variable 'controller_profile' from source: play vars 11683 1726853273.88526: variable 'controller_profile' from source: play vars 11683 1726853273.88545: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11683 1726853273.88557: when evaluation is False, skipping this task 11683 1726853273.88560: _execute() done 11683 1726853273.88562: dumping result to json 11683 1726853273.88565: done dumping result, returning 11683 1726853273.88567: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [02083763-bbaf-c5b2-e075-000000000087] 11683 1726853273.88574: sending task result for task 02083763-bbaf-c5b2-e075-000000000087 11683 1726853273.88663: done sending task result for task 02083763-bbaf-c5b2-e075-000000000087 11683 1726853273.88666: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11683 1726853273.88713: no more pending results, returning what we have 11683 1726853273.88717: results queue empty 11683 1726853273.88718: checking for any_errors_fatal 11683 1726853273.88723: done checking for any_errors_fatal 11683 1726853273.88724: checking for max_fail_percentage 11683 1726853273.88726: done checking for max_fail_percentage 11683 1726853273.88727: checking to see if all hosts have failed and the running result is not ok 11683 1726853273.88728: done checking to see if all hosts have failed 11683 1726853273.88728: getting the remaining hosts for this loop 11683 1726853273.88730: done getting the remaining hosts for this loop 11683 1726853273.88733: getting the next task for host managed_node3 11683 1726853273.88740: done getting next task for host managed_node3 11683 1726853273.88744: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11683 1726853273.88748: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853273.88765: getting variables 11683 1726853273.88767: in VariableManager get_vars() 11683 1726853273.88809: Calling all_inventory to load vars for managed_node3 11683 1726853273.88812: Calling groups_inventory to load vars for managed_node3 11683 1726853273.88814: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853273.88825: Calling all_plugins_play to load vars for managed_node3 11683 1726853273.88827: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853273.88830: Calling groups_plugins_play to load vars for managed_node3 11683 1726853273.90027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853273.91413: done with get_vars() 11683 1726853273.91442: done getting variables 11683 1726853273.91516: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 13:27:53 -0400 (0:00:00.091) 0:00:26.987 ****** 11683 1726853273.91563: entering _queue_task() for managed_node3/service 11683 1726853273.91876: worker is 1 (out of 1 available) 11683 1726853273.91894: exiting _queue_task() for managed_node3/service 11683 1726853273.91908: done queuing things up, now waiting for results queue to drain 11683 1726853273.91909: waiting for pending results... 11683 1726853273.92239: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11683 1726853273.92311: in run() - task 02083763-bbaf-c5b2-e075-000000000088 11683 1726853273.92322: variable 'ansible_search_path' from source: unknown 11683 1726853273.92326: variable 'ansible_search_path' from source: unknown 11683 1726853273.92356: calling self._execute() 11683 1726853273.92434: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.92438: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.92448: variable 'omit' from source: magic vars 11683 1726853273.92826: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.92829: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853273.92983: variable 'network_provider' from source: set_fact 11683 1726853273.92986: variable 'network_state' from source: role '' defaults 11683 1726853273.92996: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11683 1726853273.92999: variable 'omit' from source: magic vars 11683 1726853273.93059: variable 'omit' from source: magic vars 11683 1726853273.93087: variable 'network_service_name' from source: role '' defaults 11683 1726853273.93141: variable 'network_service_name' from source: role '' defaults 11683 1726853273.93216: variable '__network_provider_setup' from 
source: role '' defaults 11683 1726853273.93221: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853273.93265: variable '__network_service_name_default_nm' from source: role '' defaults 11683 1726853273.93274: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853273.93333: variable '__network_packages_default_nm' from source: role '' defaults 11683 1726853273.93481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853273.95201: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853273.95251: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853273.95281: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853273.95320: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853273.95341: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853273.95406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.95427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.95446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.95472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.95483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.95519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.95539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.95564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.95602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.95629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.95786: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11683 1726853273.95867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.95885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11683 1726853273.95902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.95926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.95936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.96012: variable 'ansible_python' from source: facts 11683 1726853273.96030: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11683 1726853273.96112: variable '__network_wpa_supplicant_required' from source: role '' defaults 11683 1726853273.96193: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11683 1726853273.96302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.96322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.96338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.96365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 
1726853273.96377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.96411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853273.96432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853273.96451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.96477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853273.96487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853273.96584: variable 'network_connections' from source: task vars 11683 1726853273.96591: variable 'port2_profile' from source: play vars 11683 1726853273.96648: variable 'port2_profile' from source: play vars 11683 1726853273.96658: variable 'port1_profile' from source: play vars 11683 1726853273.96707: variable 'port1_profile' from source: play vars 11683 1726853273.96717: variable 'controller_profile' from source: play vars 11683 1726853273.96774: variable 'controller_profile' from source: play vars 11683 1726853273.96845: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 
1726853273.96985: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853273.97021: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853273.97054: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853273.97089: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853273.97131: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853273.97154: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853273.97181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853273.97206: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853273.97242: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.97422: variable 'network_connections' from source: task vars 11683 1726853273.97428: variable 'port2_profile' from source: play vars 11683 1726853273.97483: variable 'port2_profile' from source: play vars 11683 1726853273.97491: variable 'port1_profile' from source: play vars 11683 1726853273.97542: variable 'port1_profile' from source: play vars 11683 1726853273.97554: variable 'controller_profile' from source: play vars 11683 1726853273.97605: variable 'controller_profile' from source: play vars 11683 
1726853273.97632: variable '__network_packages_default_wireless' from source: role '' defaults 11683 1726853273.97689: variable '__network_wireless_connections_defined' from source: role '' defaults 11683 1726853273.97879: variable 'network_connections' from source: task vars 11683 1726853273.97882: variable 'port2_profile' from source: play vars 11683 1726853273.97931: variable 'port2_profile' from source: play vars 11683 1726853273.97938: variable 'port1_profile' from source: play vars 11683 1726853273.97990: variable 'port1_profile' from source: play vars 11683 1726853273.97996: variable 'controller_profile' from source: play vars 11683 1726853273.98043: variable 'controller_profile' from source: play vars 11683 1726853273.98065: variable '__network_packages_default_team' from source: role '' defaults 11683 1726853273.98119: variable '__network_team_connections_defined' from source: role '' defaults 11683 1726853273.98309: variable 'network_connections' from source: task vars 11683 1726853273.98312: variable 'port2_profile' from source: play vars 11683 1726853273.98363: variable 'port2_profile' from source: play vars 11683 1726853273.98369: variable 'port1_profile' from source: play vars 11683 1726853273.98435: variable 'port1_profile' from source: play vars 11683 1726853273.98439: variable 'controller_profile' from source: play vars 11683 1726853273.98485: variable 'controller_profile' from source: play vars 11683 1726853273.98524: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853273.98566: variable '__network_service_name_default_initscripts' from source: role '' defaults 11683 1726853273.98573: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853273.98617: variable '__network_packages_default_initscripts' from source: role '' defaults 11683 1726853273.98753: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11683 1726853273.99078: 
variable 'network_connections' from source: task vars 11683 1726853273.99081: variable 'port2_profile' from source: play vars 11683 1726853273.99124: variable 'port2_profile' from source: play vars 11683 1726853273.99127: variable 'port1_profile' from source: play vars 11683 1726853273.99174: variable 'port1_profile' from source: play vars 11683 1726853273.99180: variable 'controller_profile' from source: play vars 11683 1726853273.99219: variable 'controller_profile' from source: play vars 11683 1726853273.99227: variable 'ansible_distribution' from source: facts 11683 1726853273.99230: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.99237: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.99256: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11683 1726853273.99362: variable 'ansible_distribution' from source: facts 11683 1726853273.99366: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.99373: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.99385: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11683 1726853273.99497: variable 'ansible_distribution' from source: facts 11683 1726853273.99500: variable '__network_rh_distros' from source: role '' defaults 11683 1726853273.99504: variable 'ansible_distribution_major_version' from source: facts 11683 1726853273.99529: variable 'network_provider' from source: set_fact 11683 1726853273.99549: variable 'omit' from source: magic vars 11683 1726853273.99574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853273.99597: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853273.99612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 
1726853273.99625: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853273.99633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853273.99660: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853273.99664: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.99668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.99736: Set connection var ansible_shell_executable to /bin/sh 11683 1726853273.99745: Set connection var ansible_timeout to 10 11683 1726853273.99753: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853273.99758: Set connection var ansible_pipelining to False 11683 1726853273.99760: Set connection var ansible_shell_type to sh 11683 1726853273.99762: Set connection var ansible_connection to ssh 11683 1726853273.99784: variable 'ansible_shell_executable' from source: unknown 11683 1726853273.99787: variable 'ansible_connection' from source: unknown 11683 1726853273.99789: variable 'ansible_module_compression' from source: unknown 11683 1726853273.99793: variable 'ansible_shell_type' from source: unknown 11683 1726853273.99795: variable 'ansible_shell_executable' from source: unknown 11683 1726853273.99798: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853273.99800: variable 'ansible_pipelining' from source: unknown 11683 1726853273.99802: variable 'ansible_timeout' from source: unknown 11683 1726853273.99804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853273.99879: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853273.99887: variable 'omit' from source: magic vars 11683 1726853273.99892: starting attempt loop 11683 1726853273.99895: running the handler 11683 1726853273.99953: variable 'ansible_facts' from source: unknown 11683 1726853274.00497: _low_level_execute_command(): starting 11683 1726853274.00506: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853274.01138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853274.01143: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853274.01150: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.01226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853274.01230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853274.01345: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11683 1726853274.03167: stdout chunk (state=3): >>>/root <<< 11683 1726853274.03238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853274.03241: stdout chunk (state=3): >>><<< 11683 1726853274.03260: stderr chunk (state=3): >>><<< 11683 1726853274.03283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853274.03296: _low_level_execute_command(): starting 11683 1726853274.03302: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798 `" && echo ansible-tmp-1726853274.0328312-12865-203457085418798="` echo /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798 `" ) && sleep 0' 11683 
1726853274.04126: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853274.04133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853274.04177: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.04204: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853274.04209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.04272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853274.04276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853274.04355: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853274.06616: stdout chunk (state=3): >>>ansible-tmp-1726853274.0328312-12865-203457085418798=/root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798 <<< 11683 1726853274.06621: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853274.06680: stderr chunk (state=3): >>><<< 11683 1726853274.06748: stdout chunk (state=3): >>><<< 11683 1726853274.06785: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853274.0328312-12865-203457085418798=/root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853274.06809: variable 'ansible_module_compression' from source: unknown 11683 1726853274.06861: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11683 1726853274.06926: variable 'ansible_facts' from source: unknown 11683 1726853274.07136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py 11683 1726853274.07361: Sending initial data 11683 1726853274.07374: Sent initial data (156 bytes) 11683 1726853274.07868: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
<<< 11683 1726853274.07914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853274.07920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853274.07922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853274.08022: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.08026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853274.08028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853274.08037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853274.08133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853274.09849: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11683 1726853274.09854: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" 
revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853274.09927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853274.09997: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp_bt9x3i9 /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py <<< 11683 1726853274.10001: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py" <<< 11683 1726853274.10068: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp_bt9x3i9" to remote "/root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py" <<< 11683 1726853274.12440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853274.12445: stderr chunk (state=3): >>><<< 11683 1726853274.12493: stdout chunk (state=3): >>><<< 11683 1726853274.12508: done transferring module to remote 11683 1726853274.12523: _low_level_execute_command(): starting 11683 1726853274.12532: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/ /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py && sleep 0' 11683 1726853274.13198: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 11683 1726853274.13211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853274.13224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853274.13285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.13346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853274.13368: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853274.13415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853274.13489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853274.15394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853274.15398: stderr chunk (state=3): >>><<< 11683 1726853274.15408: stdout chunk (state=3): >>><<< 11683 1726853274.15494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853274.15497: _low_level_execute_command(): starting 11683 1726853274.15499: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/AnsiballZ_systemd.py && sleep 0' 11683 1726853274.17146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.17155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853274.17157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853274.17159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853274.17175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853274.47060: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", 
"ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10448896", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317551104", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "698843000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11683 1726853274.47067: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": 
"n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit<<< 11683 1726853274.47087: stdout chunk (state=3): >>>.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", 
"InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11683 1726853274.49117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853274.49128: stdout chunk (state=3): >>><<< 11683 1726853274.49138: stderr chunk (state=3): >>><<< 11683 1726853274.49280: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainStartTimestampMonotonic": "24298536", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ExecMainHandoffTimestampMonotonic": "24318182", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10448896", "MemoryPeak": "14114816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3317551104", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "698843000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target shutdown.target cloud-init.service NetworkManager-wait-online.service multi-user.target", "After": "sysinit.target systemd-journald.socket basic.target cloud-init-local.service network-pre.target dbus.socket system.slice dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 13:26:57 EDT", "StateChangeTimestampMonotonic": "361843458", "InactiveExitTimestamp": "Fri 2024-09-20 13:21:20 EDT", "InactiveExitTimestampMonotonic": "24299070", "ActiveEnterTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ActiveEnterTimestampMonotonic": "24855925", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 13:21:20 EDT", "ConditionTimestampMonotonic": "24297535", "AssertTimestamp": "Fri 2024-09-20 13:21:20 EDT", "AssertTimestampMonotonic": "24297537", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "125a1bdc44cb4bffa8aeca788d2f2fa3", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853274.49611: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853274.49637: _low_level_execute_command(): starting 11683 1726853274.49683: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853274.0328312-12865-203457085418798/ > /dev/null 2>&1 && sleep 0' 11683 1726853274.50719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853274.50723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.50725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853274.50727: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853274.50729: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853274.50869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853274.51209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853274.51278: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853274.53255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853274.53259: stdout chunk (state=3): >>><<< 11683 1726853274.53262: stderr chunk (state=3): >>><<< 11683 1726853274.53283: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853274.53299: handler run complete 11683 1726853274.53476: attempt loop complete, returning result 11683 1726853274.53479: _execute() done 11683 1726853274.53482: dumping result to json 11683 1726853274.53484: done dumping result, returning 11683 1726853274.53498: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [02083763-bbaf-c5b2-e075-000000000088] 11683 1726853274.53776: sending task result for task 02083763-bbaf-c5b2-e075-000000000088 ok: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853274.54069: no more pending results, returning what we have 11683 1726853274.54075: results queue empty 11683 1726853274.54076: checking for any_errors_fatal 11683 1726853274.54081: done checking for any_errors_fatal 11683 1726853274.54082: checking for max_fail_percentage 11683 1726853274.54083: done checking for max_fail_percentage 11683 1726853274.54085: checking to see if all hosts have failed and the running result is not ok 11683 1726853274.54087: done checking to see if all hosts have failed 11683 1726853274.54088: getting the remaining hosts for this loop 11683 1726853274.54089: done getting the remaining hosts for this loop 11683 1726853274.54093: getting the next task for host managed_node3 11683 1726853274.54100: done getting next task for host managed_node3 11683 1726853274.54104: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11683 1726853274.54108: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853274.54120: getting variables 11683 1726853274.54122: in VariableManager get_vars() 11683 1726853274.54163: Calling all_inventory to load vars for managed_node3 11683 1726853274.54166: Calling groups_inventory to load vars for managed_node3 11683 1726853274.54168: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853274.54284: Calling all_plugins_play to load vars for managed_node3 11683 1726853274.54288: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853274.54291: Calling groups_plugins_play to load vars for managed_node3 11683 1726853274.55306: done sending task result for task 02083763-bbaf-c5b2-e075-000000000088 11683 1726853274.55310: WORKER PROCESS EXITING 11683 1726853274.56276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853274.57962: done with get_vars() 11683 1726853274.57996: done getting variables 11683 1726853274.58067: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 13:27:54 -0400 (0:00:00.665) 0:00:27.653 ****** 11683 1726853274.58112: entering _queue_task() for managed_node3/service 11683 1726853274.58597: worker is 1 (out of 1 available) 11683 1726853274.58609: exiting _queue_task() for managed_node3/service 11683 1726853274.58622: done queuing things up, now waiting for results queue to drain 11683 1726853274.58623: waiting for pending results... 11683 1726853274.58837: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11683 1726853274.59026: in run() - task 02083763-bbaf-c5b2-e075-000000000089 11683 1726853274.59053: variable 'ansible_search_path' from source: unknown 11683 1726853274.59064: variable 'ansible_search_path' from source: unknown 11683 1726853274.59113: calling self._execute() 11683 1726853274.59218: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853274.59229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853274.59241: variable 'omit' from source: magic vars 11683 1726853274.59632: variable 'ansible_distribution_major_version' from source: facts 11683 1726853274.59655: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853274.59780: variable 'network_provider' from source: set_fact 11683 1726853274.59791: Evaluated conditional (network_provider == "nm"): True 11683 1726853274.59894: variable '__network_wpa_supplicant_required' from source: role '' defaults 11683 1726853274.59985: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11683 1726853274.60163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853274.64939: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 
1726853274.64975: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853274.65094: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853274.65196: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853274.65227: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853274.65479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853274.65596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853274.65600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853274.65602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853274.65718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853274.65776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853274.65839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853274.65900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853274.65965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853274.66053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853274.66144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853274.66360: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853274.66364: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853274.66366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853274.66581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853274.66721: variable 'network_connections' from source: task vars 
11683 1726853274.66737: variable 'port2_profile' from source: play vars 11683 1726853274.66921: variable 'port2_profile' from source: play vars 11683 1726853274.66936: variable 'port1_profile' from source: play vars 11683 1726853274.67000: variable 'port1_profile' from source: play vars 11683 1726853274.67085: variable 'controller_profile' from source: play vars 11683 1726853274.67149: variable 'controller_profile' from source: play vars 11683 1726853274.67309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11683 1726853274.67869: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11683 1726853274.67915: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11683 1726853274.67952: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11683 1726853274.67991: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11683 1726853274.68039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11683 1726853274.68109: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11683 1726853274.68221: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853274.68256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11683 1726853274.68393: variable 
'__network_wireless_connections_defined' from source: role '' defaults 11683 1726853274.68844: variable 'network_connections' from source: task vars 11683 1726853274.69069: variable 'port2_profile' from source: play vars 11683 1726853274.69074: variable 'port2_profile' from source: play vars 11683 1726853274.69076: variable 'port1_profile' from source: play vars 11683 1726853274.69222: variable 'port1_profile' from source: play vars 11683 1726853274.69248: variable 'controller_profile' from source: play vars 11683 1726853274.69503: variable 'controller_profile' from source: play vars 11683 1726853274.69507: Evaluated conditional (__network_wpa_supplicant_required): False 11683 1726853274.69509: when evaluation is False, skipping this task 11683 1726853274.69512: _execute() done 11683 1726853274.69514: dumping result to json 11683 1726853274.69516: done dumping result, returning 11683 1726853274.69518: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [02083763-bbaf-c5b2-e075-000000000089] 11683 1726853274.69520: sending task result for task 02083763-bbaf-c5b2-e075-000000000089 skipping: [managed_node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11683 1726853274.69776: no more pending results, returning what we have 11683 1726853274.69781: results queue empty 11683 1726853274.69782: checking for any_errors_fatal 11683 1726853274.69802: done checking for any_errors_fatal 11683 1726853274.69803: checking for max_fail_percentage 11683 1726853274.69805: done checking for max_fail_percentage 11683 1726853274.69806: checking to see if all hosts have failed and the running result is not ok 11683 1726853274.69807: done checking to see if all hosts have failed 11683 1726853274.69808: getting the remaining hosts for this loop 11683 1726853274.69809: done getting the remaining hosts for this loop 11683 1726853274.69813: 
getting the next task for host managed_node3 11683 1726853274.69822: done getting next task for host managed_node3 11683 1726853274.69826: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11683 1726853274.69830: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853274.69852: getting variables 11683 1726853274.69855: in VariableManager get_vars() 11683 1726853274.69901: Calling all_inventory to load vars for managed_node3 11683 1726853274.69904: Calling groups_inventory to load vars for managed_node3 11683 1726853274.69906: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853274.69920: Calling all_plugins_play to load vars for managed_node3 11683 1726853274.69923: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853274.69926: Calling groups_plugins_play to load vars for managed_node3 11683 1726853274.70867: done sending task result for task 02083763-bbaf-c5b2-e075-000000000089 11683 1726853274.70872: WORKER PROCESS EXITING 11683 1726853274.73565: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853274.76761: done with get_vars() 11683 1726853274.76800: done getting variables 11683 1726853274.76855: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 13:27:54 -0400 (0:00:00.189) 0:00:27.842 ****** 11683 1726853274.77095: entering _queue_task() for managed_node3/service 11683 1726853274.77659: worker is 1 (out of 1 available) 11683 1726853274.78076: exiting _queue_task() for managed_node3/service 11683 1726853274.78087: done queuing things up, now waiting for results queue to drain 11683 1726853274.78089: waiting for pending results... 
11683 1726853274.78194: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service 11683 1726853274.78393: in run() - task 02083763-bbaf-c5b2-e075-00000000008a 11683 1726853274.78415: variable 'ansible_search_path' from source: unknown 11683 1726853274.78428: variable 'ansible_search_path' from source: unknown 11683 1726853274.78470: calling self._execute() 11683 1726853274.78577: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853274.78640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853274.78643: variable 'omit' from source: magic vars 11683 1726853274.78987: variable 'ansible_distribution_major_version' from source: facts 11683 1726853274.79004: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853274.79117: variable 'network_provider' from source: set_fact 11683 1726853274.79127: Evaluated conditional (network_provider == "initscripts"): False 11683 1726853274.79132: when evaluation is False, skipping this task 11683 1726853274.79138: _execute() done 11683 1726853274.79143: dumping result to json 11683 1726853274.79149: done dumping result, returning 11683 1726853274.79158: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Enable network service [02083763-bbaf-c5b2-e075-00000000008a] 11683 1726853274.79177: sending task result for task 02083763-bbaf-c5b2-e075-00000000008a skipping: [managed_node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11683 1726853274.79424: no more pending results, returning what we have 11683 1726853274.79428: results queue empty 11683 1726853274.79429: checking for any_errors_fatal 11683 1726853274.79438: done checking for any_errors_fatal 11683 1726853274.79439: checking for max_fail_percentage 11683 1726853274.79441: done checking for max_fail_percentage 11683 
1726853274.79442: checking to see if all hosts have failed and the running result is not ok 11683 1726853274.79443: done checking to see if all hosts have failed 11683 1726853274.79443: getting the remaining hosts for this loop 11683 1726853274.79446: done getting the remaining hosts for this loop 11683 1726853274.79449: getting the next task for host managed_node3 11683 1726853274.79456: done getting next task for host managed_node3 11683 1726853274.79460: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11683 1726853274.79463: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853274.79495: getting variables 11683 1726853274.79497: in VariableManager get_vars() 11683 1726853274.79542: Calling all_inventory to load vars for managed_node3 11683 1726853274.79544: Calling groups_inventory to load vars for managed_node3 11683 1726853274.79547: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853274.79560: Calling all_plugins_play to load vars for managed_node3 11683 1726853274.79563: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853274.79566: Calling groups_plugins_play to load vars for managed_node3 11683 1726853274.79682: done sending task result for task 02083763-bbaf-c5b2-e075-00000000008a 11683 1726853274.79686: WORKER PROCESS EXITING 11683 1726853274.82017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853274.84734: done with get_vars() 11683 1726853274.84785: done getting variables 11683 1726853274.84841: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 13:27:54 -0400 (0:00:00.077) 0:00:27.920 ****** 11683 1726853274.84878: entering _queue_task() for managed_node3/copy 11683 1726853274.85218: worker is 1 (out of 1 available) 11683 1726853274.85230: exiting _queue_task() for managed_node3/copy 11683 1726853274.85241: done queuing things up, now waiting for results queue to drain 11683 1726853274.85242: waiting for pending results... 
11683 1726853274.85874: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11683 1726853274.86122: in run() - task 02083763-bbaf-c5b2-e075-00000000008b 11683 1726853274.86270: variable 'ansible_search_path' from source: unknown 11683 1726853274.86283: variable 'ansible_search_path' from source: unknown 11683 1726853274.86427: calling self._execute() 11683 1726853274.86644: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853274.86903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853274.86907: variable 'omit' from source: magic vars 11683 1726853274.88061: variable 'ansible_distribution_major_version' from source: facts 11683 1726853274.88130: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853274.88558: variable 'network_provider' from source: set_fact 11683 1726853274.88615: Evaluated conditional (network_provider == "initscripts"): False 11683 1726853274.88924: when evaluation is False, skipping this task 11683 1726853274.88927: _execute() done 11683 1726853274.88930: dumping result to json 11683 1726853274.88932: done dumping result, returning 11683 1726853274.88936: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [02083763-bbaf-c5b2-e075-00000000008b] 11683 1726853274.88938: sending task result for task 02083763-bbaf-c5b2-e075-00000000008b 11683 1726853274.89016: done sending task result for task 02083763-bbaf-c5b2-e075-00000000008b 11683 1726853274.89019: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11683 1726853274.89075: no more pending results, returning what we have 11683 1726853274.89080: results queue empty 11683 1726853274.89081: checking for 
any_errors_fatal 11683 1726853274.89086: done checking for any_errors_fatal 11683 1726853274.89087: checking for max_fail_percentage 11683 1726853274.89089: done checking for max_fail_percentage 11683 1726853274.89090: checking to see if all hosts have failed and the running result is not ok 11683 1726853274.89091: done checking to see if all hosts have failed 11683 1726853274.89092: getting the remaining hosts for this loop 11683 1726853274.89093: done getting the remaining hosts for this loop 11683 1726853274.89096: getting the next task for host managed_node3 11683 1726853274.89104: done getting next task for host managed_node3 11683 1726853274.89107: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11683 1726853274.89111: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853274.89130: getting variables 11683 1726853274.89132: in VariableManager get_vars() 11683 1726853274.89178: Calling all_inventory to load vars for managed_node3 11683 1726853274.89181: Calling groups_inventory to load vars for managed_node3 11683 1726853274.89183: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853274.89197: Calling all_plugins_play to load vars for managed_node3 11683 1726853274.89200: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853274.89204: Calling groups_plugins_play to load vars for managed_node3 11683 1726853274.92387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853274.95289: done with get_vars() 11683 1726853274.95325: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 13:27:54 -0400 (0:00:00.105) 0:00:28.026 ****** 11683 1726853274.95423: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11683 1726853274.95775: worker is 1 (out of 1 available) 11683 1726853274.95788: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_connections 11683 1726853274.95800: done queuing things up, now waiting for results queue to drain 11683 1726853274.95801: waiting for pending results... 
11683 1726853274.96189: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11683 1726853274.96239: in run() - task 02083763-bbaf-c5b2-e075-00000000008c 11683 1726853274.96264: variable 'ansible_search_path' from source: unknown 11683 1726853274.96275: variable 'ansible_search_path' from source: unknown 11683 1726853274.96320: calling self._execute() 11683 1726853274.96421: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853274.96434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853274.96453: variable 'omit' from source: magic vars 11683 1726853274.96823: variable 'ansible_distribution_major_version' from source: facts 11683 1726853274.96940: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853274.96944: variable 'omit' from source: magic vars 11683 1726853274.96946: variable 'omit' from source: magic vars 11683 1726853274.97177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11683 1726853274.99248: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11683 1726853274.99322: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11683 1726853274.99368: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11683 1726853274.99409: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11683 1726853274.99446: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11683 1726853274.99527: variable 'network_provider' from source: set_fact 11683 1726853274.99664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11683 1726853274.99716: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11683 1726853274.99761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11683 1726853274.99802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11683 1726853274.99825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11683 1726853274.99978: variable 'omit' from source: magic vars 11683 1726853275.00033: variable 'omit' from source: magic vars 11683 1726853275.00142: variable 'network_connections' from source: task vars 11683 1726853275.00160: variable 'port2_profile' from source: play vars 11683 1726853275.00233: variable 'port2_profile' from source: play vars 11683 1726853275.00248: variable 'port1_profile' from source: play vars 11683 1726853275.00315: variable 'port1_profile' from source: play vars 11683 1726853275.00328: variable 'controller_profile' from source: play vars 11683 1726853275.00390: variable 'controller_profile' from source: play vars 11683 1726853275.00561: variable 'omit' from source: magic vars 11683 1726853275.00577: variable '__lsr_ansible_managed' from source: task vars 11683 1726853275.00643: variable '__lsr_ansible_managed' from source: task vars 11683 1726853275.00847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11683 
1726853275.01390: Loaded config def from plugin (lookup/template) 11683 1726853275.01404: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11683 1726853275.01476: File lookup term: get_ansible_managed.j2 11683 1726853275.01479: variable 'ansible_search_path' from source: unknown 11683 1726853275.01482: evaluation_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11683 1726853275.01486: search_path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11683 1726853275.01489: variable 'ansible_search_path' from source: unknown 11683 1726853275.07292: variable 'ansible_managed' from source: unknown 11683 1726853275.07440: variable 'omit' from source: magic vars 11683 1726853275.07479: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853275.07582: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853275.07585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853275.07588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853275.07590: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853275.07602: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853275.07616: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853275.07625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853275.07727: Set connection var ansible_shell_executable to /bin/sh 11683 1726853275.07744: Set connection var ansible_timeout to 10 11683 1726853275.07758: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853275.07768: Set connection var ansible_pipelining to False 11683 1726853275.07779: Set connection var ansible_shell_type to sh 11683 1726853275.07786: Set connection var ansible_connection to ssh 11683 1726853275.07812: variable 'ansible_shell_executable' from source: unknown 11683 1726853275.07820: variable 'ansible_connection' from source: unknown 11683 1726853275.07875: variable 'ansible_module_compression' from source: unknown 11683 1726853275.07878: variable 'ansible_shell_type' from source: unknown 11683 1726853275.07880: variable 'ansible_shell_executable' from source: unknown 11683 1726853275.07882: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853275.07883: variable 'ansible_pipelining' from source: unknown 11683 1726853275.07885: variable 'ansible_timeout' from source: unknown 11683 1726853275.07894: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 
1726853275.07992: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853275.08006: variable 'omit' from source: magic vars 11683 1726853275.08016: starting attempt loop 11683 1726853275.08021: running the handler 11683 1726853275.08036: _low_level_execute_command(): starting 11683 1726853275.08049: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853275.08722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853275.08738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853275.08755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853275.08803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853275.08817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853275.08886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853275.08911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
11683 1726853275.08929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853275.09024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853275.10765: stdout chunk (state=3): >>>/root <<< 11683 1726853275.10934: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853275.10937: stdout chunk (state=3): >>><<< 11683 1726853275.10940: stderr chunk (state=3): >>><<< 11683 1726853275.10965: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853275.10986: _low_level_execute_command(): starting 11683 1726853275.11079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282 `" && echo ansible-tmp-1726853275.1097376-12922-211803067350282="` echo /root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282 `" ) && sleep 0' 11683 1726853275.11769: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853275.11775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853275.11778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853275.11780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853275.11783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853275.11817: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853275.11821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853275.11865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853275.11982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853275.13965: stdout chunk (state=3): 
>>>ansible-tmp-1726853275.1097376-12922-211803067350282=/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282 <<< 11683 1726853275.14063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853275.14201: stderr chunk (state=3): >>><<< 11683 1726853275.14213: stdout chunk (state=3): >>><<< 11683 1726853275.14240: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853275.1097376-12922-211803067350282=/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853275.14297: variable 'ansible_module_compression' from source: unknown 11683 1726853275.14596: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11683 1726853275.14599: variable 'ansible_facts' from source: unknown 11683 1726853275.14904: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py 11683 1726853275.15510: Sending initial data 11683 1726853275.15513: Sent initial data (168 bytes) 11683 1726853275.16904: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853275.16934: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853275.17024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853275.18918: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 
debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853275.18979: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853275.19166: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpourtix3s /root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py <<< 11683 1726853275.19173: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpourtix3s" to remote "/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py" <<< 11683 1726853275.20648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853275.20708: stderr chunk (state=3): >>><<< 11683 1726853275.20723: stdout chunk (state=3): >>><<< 11683 1726853275.20750: done transferring module to remote 11683 1726853275.20800: _low_level_execute_command(): starting 11683 1726853275.20836: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/ 
/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py && sleep 0' 11683 1726853275.21448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853275.21464: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853275.21580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853275.21592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853275.21610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853275.21735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853275.23681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853275.23692: stdout chunk (state=3): >>><<< 11683 1726853275.23702: stderr chunk (state=3): >>><<< 11683 1726853275.23733: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853275.23975: _low_level_execute_command(): starting 11683 1726853275.23979: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/AnsiballZ_network_connections.py && sleep 0' 11683 1726853275.24625: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853275.24687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853275.24751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853275.24775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853275.24799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853275.24904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853275.78954: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/3f4378cb-8ba1-4df0-ad5d-f4be4454b744: error=unknown <<< 11683 1726853275.80752: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in 
_nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/014f8cbf-bba4-4157-aa64-400d4d1c3b6d: error=unknown <<< 11683 1726853275.82547: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/3da92b17-45a3-429c-9e42-16f5e5b46354: error=unknown <<< 11683 1726853275.82773: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, 
{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11683 1726853275.84753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853275.84757: stdout chunk (state=3): >>><<< 11683 1726853275.84759: stderr chunk (state=3): >>><<< 11683 1726853275.84910: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/3f4378cb-8ba1-4df0-ad5d-f4be4454b744: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/014f8cbf-bba4-4157-aa64-400d4d1c3b6d: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jef8hvqz/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/3da92b17-45a3-429c-9e42-16f5e5b46354: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853275.84918: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853275.84921: _low_level_execute_command(): starting 11683 1726853275.84923: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853275.1097376-12922-211803067350282/ > /dev/null 2>&1 && sleep 0' 11683 1726853275.85546: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853275.85561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853275.85590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853275.85690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853275.85694: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853275.85736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853275.85757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853275.85786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853275.85892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853275.87842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853275.87846: stdout chunk (state=3): >>><<< 11683 1726853275.87976: stderr chunk (state=3): >>><<< 11683 1726853275.87980: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853275.87982: handler run complete 11683 1726853275.87984: attempt loop complete, returning result 11683 1726853275.87986: _execute() done 11683 1726853275.87988: dumping result to json 11683 1726853275.87990: done dumping result, returning 11683 1726853275.87992: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [02083763-bbaf-c5b2-e075-00000000008c] 11683 1726853275.87994: sending task result for task 02083763-bbaf-c5b2-e075-00000000008c 11683 1726853275.88065: done sending task result for task 02083763-bbaf-c5b2-e075-00000000008c 11683 1726853275.88068: WORKER PROCESS EXITING
changed: [managed_node3] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "bond0.1",
                    "persistent_state": "absent",
                    "state": "down"
                },
                {
                    "name": "bond0.0",
                    "persistent_state": "absent",
                    "state": "down"
                },
                {
                    "name": "bond0",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

11683 1726853275.88312: no more pending results, returning what we have 11683 1726853275.88316: results queue empty 11683 1726853275.88317: checking for any_errors_fatal 11683 1726853275.88324: done checking for any_errors_fatal 11683 1726853275.88325: checking for max_fail_percentage 11683 1726853275.88327: done checking for max_fail_percentage 11683
1726853275.88328: checking to see if all hosts have failed and the running result is not ok 11683 1726853275.88329: done checking to see if all hosts have failed 11683 1726853275.88330: getting the remaining hosts for this loop 11683 1726853275.88331: done getting the remaining hosts for this loop 11683 1726853275.88334: getting the next task for host managed_node3 11683 1726853275.88341: done getting next task for host managed_node3 11683 1726853275.88348: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11683 1726853275.88352: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853275.88365: getting variables 11683 1726853275.88367: in VariableManager get_vars() 11683 1726853275.88517: Calling all_inventory to load vars for managed_node3 11683 1726853275.88520: Calling groups_inventory to load vars for managed_node3 11683 1726853275.88523: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853275.88539: Calling all_plugins_play to load vars for managed_node3 11683 1726853275.88545: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853275.88549: Calling groups_plugins_play to load vars for managed_node3 11683 1726853275.90283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853275.92008: done with get_vars() 11683 1726853275.92035: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Friday 20 September 2024 13:27:55 -0400 (0:00:00.967) 0:00:28.993 ******

11683 1726853275.92132: entering _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11683 1726853275.92477: worker is 1 (out of 1 available) 11683 1726853275.92491: exiting _queue_task() for managed_node3/fedora.linux_system_roles.network_state 11683 1726853275.92502: done queuing things up, now waiting for results queue to drain 11683 1726853275.92503: waiting for pending results... 
11683 1726853275.92891: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state 11683 1726853275.92961: in run() - task 02083763-bbaf-c5b2-e075-00000000008d 11683 1726853275.92994: variable 'ansible_search_path' from source: unknown 11683 1726853275.93005: variable 'ansible_search_path' from source: unknown 11683 1726853275.93048: calling self._execute() 11683 1726853275.93151: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853275.93162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853275.93181: variable 'omit' from source: magic vars 11683 1726853275.93555: variable 'ansible_distribution_major_version' from source: facts 11683 1726853275.93749: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853275.93884: variable 'network_state' from source: role '' defaults 11683 1726853275.93901: Evaluated conditional (network_state != {}): False 11683 1726853275.93909: when evaluation is False, skipping this task 11683 1726853275.93915: _execute() done 11683 1726853275.93922: dumping result to json 11683 1726853275.93929: done dumping result, returning 11683 1726853275.93940: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Configure networking state [02083763-bbaf-c5b2-e075-00000000008d] 11683 1726853275.93977: sending task result for task 02083763-bbaf-c5b2-e075-00000000008d 11683 1726853275.94279: done sending task result for task 02083763-bbaf-c5b2-e075-00000000008d 11683 1726853275.94283: WORKER PROCESS EXITING
skipping: [managed_node3] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11683 1726853275.94341: no more pending results, returning what we have 11683 1726853275.94346: results queue empty 11683 1726853275.94347: checking for any_errors_fatal 11683 1726853275.94361: done checking for any_errors_fatal 
11683 1726853275.94362: checking for max_fail_percentage 11683 1726853275.94364: done checking for max_fail_percentage 11683 1726853275.94365: checking to see if all hosts have failed and the running result is not ok 11683 1726853275.94366: done checking to see if all hosts have failed 11683 1726853275.94367: getting the remaining hosts for this loop 11683 1726853275.94369: done getting the remaining hosts for this loop 11683 1726853275.94475: getting the next task for host managed_node3 11683 1726853275.94482: done getting next task for host managed_node3 11683 1726853275.94486: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11683 1726853275.94490: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853275.94511: getting variables 11683 1726853275.94513: in VariableManager get_vars() 11683 1726853275.94551: Calling all_inventory to load vars for managed_node3 11683 1726853275.94553: Calling groups_inventory to load vars for managed_node3 11683 1726853275.94556: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853275.94568: Calling all_plugins_play to load vars for managed_node3 11683 1726853275.94570: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853275.94980: Calling groups_plugins_play to load vars for managed_node3 11683 1726853275.96606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853276.00386: done with get_vars() 11683 1726853276.00421: done getting variables 11683 1726853276.00891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Friday 20 September 2024 13:27:56 -0400 (0:00:00.087) 0:00:29.081 ******

11683 1726853276.00931: entering _queue_task() for managed_node3/debug 11683 1726853276.02087: worker is 1 (out of 1 available) 11683 1726853276.02099: exiting _queue_task() for managed_node3/debug 11683 1726853276.02110: done queuing things up, now waiting for results queue to drain 11683 1726853276.02111: waiting for pending results... 
11683 1726853276.02841: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11683 1726853276.03404: in run() - task 02083763-bbaf-c5b2-e075-00000000008e 11683 1726853276.03409: variable 'ansible_search_path' from source: unknown 11683 1726853276.03413: variable 'ansible_search_path' from source: unknown 11683 1726853276.03416: calling self._execute() 11683 1726853276.03611: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.03615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.03627: variable 'omit' from source: magic vars 11683 1726853276.04509: variable 'ansible_distribution_major_version' from source: facts 11683 1726853276.04522: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853276.04579: variable 'omit' from source: magic vars 11683 1726853276.04788: variable 'omit' from source: magic vars 11683 1726853276.04828: variable 'omit' from source: magic vars 11683 1726853276.04918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853276.04959: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853276.04991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853276.05015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.05108: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.05111: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853276.05114: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.05119: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node3' 11683 1726853276.05620: Set connection var ansible_shell_executable to /bin/sh 11683 1726853276.05637: Set connection var ansible_timeout to 10 11683 1726853276.05640: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853276.05642: Set connection var ansible_pipelining to False 11683 1726853276.05647: Set connection var ansible_shell_type to sh 11683 1726853276.05652: Set connection var ansible_connection to ssh 11683 1726853276.05678: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.05681: variable 'ansible_connection' from source: unknown 11683 1726853276.05684: variable 'ansible_module_compression' from source: unknown 11683 1726853276.05687: variable 'ansible_shell_type' from source: unknown 11683 1726853276.05689: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.05691: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.05695: variable 'ansible_pipelining' from source: unknown 11683 1726853276.05697: variable 'ansible_timeout' from source: unknown 11683 1726853276.05701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.05942: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853276.05962: variable 'omit' from source: magic vars 11683 1726853276.06076: starting attempt loop 11683 1726853276.06079: running the handler 11683 1726853276.06128: variable '__network_connections_result' from source: set_fact 11683 1726853276.06185: handler run complete 11683 1726853276.06212: attempt loop complete, returning result 11683 1726853276.06220: _execute() done 11683 1726853276.06228: dumping result to json 11683 1726853276.06235: 
done dumping result, returning 11683 1726853276.06248: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [02083763-bbaf-c5b2-e075-00000000008e] 11683 1726853276.06257: sending task result for task 02083763-bbaf-c5b2-e075-00000000008e 11683 1726853276.06535: done sending task result for task 02083763-bbaf-c5b2-e075-00000000008e 11683 1726853276.06538: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
11683 1726853276.06604: no more pending results, returning what we have 11683 1726853276.06608: results queue empty 11683 1726853276.06609: checking for any_errors_fatal 11683 1726853276.06614: done checking for any_errors_fatal 11683 1726853276.06615: checking for max_fail_percentage 11683 1726853276.06617: done checking for max_fail_percentage 11683 1726853276.06618: checking to see if all hosts have failed and the running result is not ok 11683 1726853276.06620: done checking to see if all hosts have failed 11683 1726853276.06620: getting the remaining hosts for this loop 11683 1726853276.06622: done getting the remaining hosts for this loop 11683 1726853276.06625: getting the next task for host managed_node3 11683 1726853276.06631: done getting next task for host managed_node3 11683 1726853276.06635: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11683 1726853276.06639: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853276.06650: getting variables 11683 1726853276.06652: in VariableManager get_vars() 11683 1726853276.06694: Calling all_inventory to load vars for managed_node3 11683 1726853276.06698: Calling groups_inventory to load vars for managed_node3 11683 1726853276.06700: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853276.06711: Calling all_plugins_play to load vars for managed_node3 11683 1726853276.06714: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853276.06717: Calling groups_plugins_play to load vars for managed_node3 11683 1726853276.08388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853276.09883: done with get_vars() 11683 1726853276.09912: done getting variables 11683 1726853276.09976: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 13:27:56 -0400 (0:00:00.090) 0:00:29.172 ****** 11683 1726853276.10013: entering _queue_task() for managed_node3/debug 11683 1726853276.10492: worker is 1 (out of 1 available) 11683 
1726853276.10504: exiting _queue_task() for managed_node3/debug 11683 1726853276.10516: done queuing things up, now waiting for results queue to drain 11683 1726853276.10518: waiting for pending results... 11683 1726853276.11139: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11683 1726853276.11465: in run() - task 02083763-bbaf-c5b2-e075-00000000008f 11683 1726853276.11485: variable 'ansible_search_path' from source: unknown 11683 1726853276.11490: variable 'ansible_search_path' from source: unknown 11683 1726853276.11777: calling self._execute() 11683 1726853276.11803: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.11807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.11826: variable 'omit' from source: magic vars 11683 1726853276.12350: variable 'ansible_distribution_major_version' from source: facts 11683 1726853276.12362: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853276.12368: variable 'omit' from source: magic vars 11683 1726853276.12515: variable 'omit' from source: magic vars 11683 1726853276.12702: variable 'omit' from source: magic vars 11683 1726853276.12738: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853276.12773: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853276.12900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853276.12904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.12906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.12908: variable 
'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853276.12910: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.12912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.12984: Set connection var ansible_shell_executable to /bin/sh 11683 1726853276.13000: Set connection var ansible_timeout to 10 11683 1726853276.13016: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853276.13117: Set connection var ansible_pipelining to False 11683 1726853276.13120: Set connection var ansible_shell_type to sh 11683 1726853276.13122: Set connection var ansible_connection to ssh 11683 1726853276.13124: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.13126: variable 'ansible_connection' from source: unknown 11683 1726853276.13129: variable 'ansible_module_compression' from source: unknown 11683 1726853276.13130: variable 'ansible_shell_type' from source: unknown 11683 1726853276.13133: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.13134: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.13136: variable 'ansible_pipelining' from source: unknown 11683 1726853276.13138: variable 'ansible_timeout' from source: unknown 11683 1726853276.13140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.13242: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853276.13259: variable 'omit' from source: magic vars 11683 1726853276.13269: starting attempt loop 11683 1726853276.13279: running the handler 11683 1726853276.13337: variable '__network_connections_result' from source: set_fact 11683 
1726853276.13416: variable '__network_connections_result' from source: set_fact 11683 1726853276.13659: handler run complete 11683 1726853276.13662: attempt loop complete, returning result 11683 1726853276.13664: _execute() done 11683 1726853276.13666: dumping result to json 11683 1726853276.13668: done dumping result, returning 11683 1726853276.13670: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [02083763-bbaf-c5b2-e075-00000000008f] 11683 1726853276.13675: sending task result for task 02083763-bbaf-c5b2-e075-00000000008f 11683 1726853276.13746: done sending task result for task 02083763-bbaf-c5b2-e075-00000000008f 11683 1726853276.13750: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
11683 1726853276.13863: no more pending results, returning what we have 11683 1726853276.13868: results queue empty 11683 1726853276.13869: checking for any_errors_fatal 11683 1726853276.13878: done checking for any_errors_fatal 11683 1726853276.13879: checking for max_fail_percentage 11683 1726853276.13882: done checking for max_fail_percentage 11683 1726853276.13883: checking to see if all hosts have failed and the running result is not ok 11683 1726853276.13884: done checking to see if all hosts have failed 11683 1726853276.13885: getting the remaining hosts for this loop 11683 1726853276.13887: done getting the remaining hosts for this loop 11683 1726853276.13891: 
getting the next task for host managed_node3 11683 1726853276.13899: done getting next task for host managed_node3 11683 1726853276.13903: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11683 1726853276.13908: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853276.13921: getting variables 11683 1726853276.13925: in VariableManager get_vars() 11683 1726853276.13966: Calling all_inventory to load vars for managed_node3 11683 1726853276.13969: Calling groups_inventory to load vars for managed_node3 11683 1726853276.14081: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853276.14098: Calling all_plugins_play to load vars for managed_node3 11683 1726853276.14108: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853276.14113: Calling groups_plugins_play to load vars for managed_node3 11683 1726853276.16188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853276.18117: done with get_vars() 11683 1726853276.18145: done getting variables 11683 1726853276.18207: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Friday 20 September 2024 13:27:56 -0400 (0:00:00.082) 0:00:29.254 ******

11683 1726853276.18245: entering _queue_task() for managed_node3/debug 11683 1726853276.18595: worker is 1 (out of 1 available) 11683 1726853276.18608: exiting _queue_task() for managed_node3/debug 11683 1726853276.18621: done queuing things up, now waiting for results queue to drain 11683 1726853276.18622: waiting for pending results... 
11683 1726853276.18999: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11683 1726853276.19097: in run() - task 02083763-bbaf-c5b2-e075-000000000090 11683 1726853276.19102: variable 'ansible_search_path' from source: unknown 11683 1726853276.19277: variable 'ansible_search_path' from source: unknown 11683 1726853276.19280: calling self._execute() 11683 1726853276.19283: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.19285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.19288: variable 'omit' from source: magic vars 11683 1726853276.19785: variable 'ansible_distribution_major_version' from source: facts 11683 1726853276.19804: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853276.19929: variable 'network_state' from source: role '' defaults 11683 1726853276.19944: Evaluated conditional (network_state != {}): False 11683 1726853276.19952: when evaluation is False, skipping this task 11683 1726853276.19959: _execute() done 11683 1726853276.19965: dumping result to json 11683 1726853276.19976: done dumping result, returning 11683 1726853276.19988: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [02083763-bbaf-c5b2-e075-000000000090] 11683 1726853276.20003: sending task result for task 02083763-bbaf-c5b2-e075-000000000090
skipping: [managed_node3] => {
    "false_condition": "network_state != {}"
}
11683 1726853276.20250: no more pending results, returning what we have 11683 1726853276.20254: results queue empty 11683 1726853276.20255: checking for any_errors_fatal 11683 1726853276.20273: done checking for any_errors_fatal 11683 1726853276.20274: checking for max_fail_percentage 11683 1726853276.20276: done checking for max_fail_percentage 11683 1726853276.20277: checking to see if all hosts have 
failed and the running result is not ok 11683 1726853276.20279: done checking to see if all hosts have failed 11683 1726853276.20279: getting the remaining hosts for this loop 11683 1726853276.20281: done getting the remaining hosts for this loop 11683 1726853276.20284: getting the next task for host managed_node3 11683 1726853276.20292: done getting next task for host managed_node3 11683 1726853276.20296: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11683 1726853276.20302: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
11683 1726853276.20323: getting variables
11683 1726853276.20325: in VariableManager get_vars()
11683 1726853276.20364: Calling all_inventory to load vars for managed_node3
11683 1726853276.20367: Calling groups_inventory to load vars for managed_node3
11683 1726853276.20370: Calling all_plugins_inventory to load vars for managed_node3
11683 1726853276.20584: Calling all_plugins_play to load vars for managed_node3
11683 1726853276.20587: Calling groups_plugins_inventory to load vars for managed_node3
11683 1726853276.20591: Calling groups_plugins_play to load vars for managed_node3
11683 1726853276.21284: done sending task result for task 02083763-bbaf-c5b2-e075-000000000090
11683 1726853276.21288: WORKER PROCESS EXITING
11683 1726853276.26262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11683 1726853276.27768: done with get_vars()
11683 1726853276.27796: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Friday 20 September 2024 13:27:56 -0400 (0:00:00.096) 0:00:29.350 ******
11683 1726853276.27885: entering _queue_task() for managed_node3/ping
11683 1726853276.28241: worker is 1 (out of 1 available)
11683 1726853276.28255: exiting _queue_task() for managed_node3/ping
11683 1726853276.28266: done queuing things up, now waiting for results queue to drain
11683 1726853276.28267: waiting for pending results...
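Aside: the skip recorded in the previous task ("Evaluated conditional (network_state != {}): False ... skipping this task") is Ansible's standard `when:` handling: each condition is templated and evaluated, and the first falsy one is reported back in the result as `false_condition`. A minimal stand-in sketch of that control flow (not the actual `TaskExecutor` code; the real implementation evaluates Jinja2 expressions rather than calling `eval`):

```python
def evaluate_when(conditions, variables):
    """Return a skip-style result for the first falsy condition, mimicking
    the 'false_condition' field visible in the log output above."""
    for cond in conditions:
        # Stand-in: evaluate the expression against the task's variables.
        # Ansible templates the condition with Jinja2 instead.
        if not eval(cond, {}, variables):
            return {"skipped": True, "false_condition": cond}
    return {"skipped": False}

# network_state defaults to {} in the role, so the condition is falsy:
result = evaluate_when(["network_state != {}"], {"network_state": {}})
```

With a non-empty `network_state` the same call would return `{"skipped": False}` and the task body would run.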
11683 1726853276.28560: running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 11683 1726853276.28726: in run() - task 02083763-bbaf-c5b2-e075-000000000091 11683 1726853276.28750: variable 'ansible_search_path' from source: unknown 11683 1726853276.28759: variable 'ansible_search_path' from source: unknown 11683 1726853276.28810: calling self._execute() 11683 1726853276.28910: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.28926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.28944: variable 'omit' from source: magic vars 11683 1726853276.29347: variable 'ansible_distribution_major_version' from source: facts 11683 1726853276.29365: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853276.29379: variable 'omit' from source: magic vars 11683 1726853276.29441: variable 'omit' from source: magic vars 11683 1726853276.29483: variable 'omit' from source: magic vars 11683 1726853276.29523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853276.29560: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853276.29592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853276.29613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.29628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.29660: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853276.29667: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.29678: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node3' 11683 1726853276.29769: Set connection var ansible_shell_executable to /bin/sh 11683 1726853276.29791: Set connection var ansible_timeout to 10 11683 1726853276.29802: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853276.29811: Set connection var ansible_pipelining to False 11683 1726853276.29817: Set connection var ansible_shell_type to sh 11683 1726853276.29823: Set connection var ansible_connection to ssh 11683 1726853276.29876: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.29879: variable 'ansible_connection' from source: unknown 11683 1726853276.29882: variable 'ansible_module_compression' from source: unknown 11683 1726853276.29883: variable 'ansible_shell_type' from source: unknown 11683 1726853276.29885: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.29892: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.29894: variable 'ansible_pipelining' from source: unknown 11683 1726853276.29897: variable 'ansible_timeout' from source: unknown 11683 1726853276.29899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.30110: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11683 1726853276.30115: variable 'omit' from source: magic vars 11683 1726853276.30118: starting attempt loop 11683 1726853276.30123: running the handler 11683 1726853276.30141: _low_level_execute_command(): starting 11683 1726853276.30218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853276.30987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.31003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.31109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.32861: stdout chunk (state=3): >>>/root <<< 11683 1726853276.33010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.33029: stdout chunk (state=3): >>><<< 11683 1726853276.33045: stderr chunk (state=3): >>><<< 11683 1726853276.33083: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853276.33104: _low_level_execute_command(): starting 11683 1726853276.33117: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610 `" && echo ansible-tmp-1726853276.3309104-13005-151659039587610="` echo /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610 `" ) && sleep 0' 11683 1726853276.33754: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.33767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.33780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.33796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853276.33809: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853276.33818: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853276.33916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.33945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.33968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.34061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.36043: stdout chunk (state=3): >>>ansible-tmp-1726853276.3309104-13005-151659039587610=/root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610 <<< 11683 1726853276.36222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.36226: stdout chunk (state=3): >>><<< 11683 1726853276.36229: stderr chunk (state=3): >>><<< 11683 1726853276.36378: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853276.3309104-13005-151659039587610=/root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853276.36383: variable 'ansible_module_compression' from source: unknown 11683 1726853276.36386: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11683 1726853276.36403: variable 'ansible_facts' from source: unknown 11683 1726853276.36491: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py 11683 1726853276.36749: Sending initial data 11683 1726853276.36752: Sent initial data (153 bytes) 11683 1726853276.37343: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.37357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.37370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.37473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.37489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.37594: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.39587: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11683 1726853276.39592: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11683 1726853276.39594: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11683 1726853276.39597: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11683 1726853276.39604: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11683 1726853276.39611: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11683 1726853276.39617: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11683 1726853276.39625: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11683 1726853276.39645: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853276.39987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853276.40015: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpbuptqxrx /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py <<< 11683 1726853276.40019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py" <<< 11683 1726853276.40088: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpbuptqxrx" to remote "/root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py" <<< 11683 1726853276.41027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.41031: stderr chunk (state=3): >>><<< 11683 1726853276.41034: stdout chunk (state=3): >>><<< 11683 1726853276.41036: done transferring module to remote 11683 1726853276.41038: _low_level_execute_command(): starting 11683 1726853276.41040: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/ /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py && sleep 0' 11683 1726853276.41721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.41753: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.41772: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 11683 1726853276.41859: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.41894: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.41993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.44026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.44030: stderr chunk (state=3): >>><<< 11683 1726853276.44180: stdout chunk (state=3): >>><<< 11683 1726853276.44184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853276.44187: _low_level_execute_command(): starting 11683 1726853276.44189: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/AnsiballZ_ping.py && sleep 0' 11683 1726853276.44902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.44927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.45051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.45086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.45160: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.45482: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11683 1726853276.60651: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11683 1726853276.62137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853276.62150: stdout chunk (state=3): >>><<< 11683 1726853276.62162: stderr chunk (state=3): >>><<< 11683 1726853276.62189: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
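Aside: the `{"ping": "pong", "invocation": {...}}` JSON in the stdout above is the entire payload of the `ping` module: the controller uploads `AnsiballZ_ping.py` to the remote temp dir, executes it with the remote Python, and parses one line of JSON from stdout. A minimal stand-in for the module body (a hypothetical simplification, not the real `ansible.modules.ping` source, which goes through `AnsibleModule` argument handling):

```python
import json

def ping(module_args=None):
    # Echo back the "data" argument (default "pong"), wrapped the same way
    # the module result appears in the log's stdout chunk above.
    args = {"data": "pong", **(module_args or {})}
    return json.dumps({"ping": args["data"],
                       "invocation": {"module_args": args}})

result = ping()
```

The controller side treats any parseable JSON object on stdout with rc=0 as the module result, which is why the surrounding ssh debug noise goes to stderr instead.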
11683 1726853276.62218: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853276.62236: _low_level_execute_command(): starting 11683 1726853276.62246: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853276.3309104-13005-151659039587610/ > /dev/null 2>&1 && sleep 0' 11683 1726853276.62847: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.62864: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.62881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.62900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853276.62993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.63010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853276.63029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.63052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.63141: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.65089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.65093: stdout chunk (state=3): >>><<< 11683 1726853276.65099: stderr chunk (state=3): >>><<< 11683 1726853276.65126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11683 1726853276.65134: handler run complete
11683 1726853276.65152: attempt loop complete, returning result
11683 1726853276.65156: _execute() done
11683 1726853276.65158: dumping result to json
11683 1726853276.65160: done dumping result, returning
11683 1726853276.65172: done running TaskExecutor() for managed_node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [02083763-bbaf-c5b2-e075-000000000091]
11683 1726853276.65175: sending task result for task 02083763-bbaf-c5b2-e075-000000000091
11683 1726853276.65270: done sending task result for task 02083763-bbaf-c5b2-e075-000000000091
11683 1726853276.65275: WORKER PROCESS EXITING
ok: [managed_node3] => {
    "changed": false,
    "ping": "pong"
}
11683 1726853276.65348: no more pending results, returning what we have
11683 1726853276.65352: results queue empty
11683 1726853276.65353: checking for any_errors_fatal
11683 1726853276.65362: done checking for any_errors_fatal
11683 1726853276.65363: checking for max_fail_percentage
11683 1726853276.65364: done checking for max_fail_percentage
11683 1726853276.65365: checking to see if all hosts have failed and the running result is not ok
11683 1726853276.65367: done checking to see if all hosts have failed
11683 1726853276.65367: getting the remaining hosts for this loop
11683 1726853276.65369: done getting the remaining hosts for this loop
11683 1726853276.65578: getting the next task for host managed_node3
11683 1726853276.65589: done getting next task for host managed_node3
11683 1726853276.65591: ^ task is: TASK: meta (role_complete)
11683 1726853276.65596: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853276.65609: getting variables 11683 1726853276.65611: in VariableManager get_vars() 11683 1726853276.65654: Calling all_inventory to load vars for managed_node3 11683 1726853276.65657: Calling groups_inventory to load vars for managed_node3 11683 1726853276.65660: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853276.65670: Calling all_plugins_play to load vars for managed_node3 11683 1726853276.65679: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853276.65688: Calling groups_plugins_play to load vars for managed_node3 11683 1726853276.67230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853276.69269: done with get_vars() 11683 1726853276.69293: done getting variables 11683 1726853276.69376: done queuing things up, now waiting for results queue to drain 11683 1726853276.69379: results queue empty 11683 1726853276.69380: checking for any_errors_fatal 11683 1726853276.69382: done checking for any_errors_fatal 11683 1726853276.69383: checking for max_fail_percentage 11683 1726853276.69384: done checking for max_fail_percentage 11683 1726853276.69385: checking to see if all hosts have failed and the running result is not ok 11683 1726853276.69386: done checking to see if all hosts have failed 11683 1726853276.69386: getting the 
remaining hosts for this loop 11683 1726853276.69387: done getting the remaining hosts for this loop 11683 1726853276.69390: getting the next task for host managed_node3 11683 1726853276.69395: done getting next task for host managed_node3 11683 1726853276.69397: ^ task is: TASK: Delete the device '{{ controller_device }}' 11683 1726853276.69399: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853276.69402: getting variables 11683 1726853276.69402: in VariableManager get_vars() 11683 1726853276.69417: Calling all_inventory to load vars for managed_node3 11683 1726853276.69421: Calling groups_inventory to load vars for managed_node3 11683 1726853276.69423: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853276.69428: Calling all_plugins_play to load vars for managed_node3 11683 1726853276.69430: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853276.69433: Calling groups_plugins_play to load vars for managed_node3 11683 1726853276.70538: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853276.72083: done with get_vars() 11683 1726853276.72112: done getting variables 11683 1726853276.72159: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11683 1726853276.72282: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Friday 20 September 2024 13:27:56 -0400 (0:00:00.444) 0:00:29.795 ****** 11683 1726853276.72311: entering _queue_task() for managed_node3/command 11683 1726853276.72645: worker is 1 (out of 1 available) 11683 1726853276.72657: exiting _queue_task() for managed_node3/command 11683 1726853276.72669: done queuing things up, now waiting for results queue to drain 11683 1726853276.72672: waiting for pending results... 11683 1726853276.73089: running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' 11683 1726853276.73096: in run() - task 02083763-bbaf-c5b2-e075-0000000000c1 11683 1726853276.73100: variable 'ansible_search_path' from source: unknown 11683 1726853276.73217: calling self._execute() 11683 1726853276.73240: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.73252: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.73270: variable 'omit' from source: magic vars 11683 1726853276.73660: variable 'ansible_distribution_major_version' from source: facts 11683 1726853276.73680: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853276.73692: variable 'omit' from source: magic vars 11683 1726853276.73717: variable 'omit' from source: magic vars 11683 1726853276.73818: variable 'controller_device' from source: play vars 11683 1726853276.73843: variable 'omit' from source: magic vars 11683 1726853276.73894: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 
1726853276.73936: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853276.73963: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853276.74076: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.74081: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853276.74083: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853276.74086: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.74088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853276.74158: Set connection var ansible_shell_executable to /bin/sh 11683 1726853276.74176: Set connection var ansible_timeout to 10 11683 1726853276.74186: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853276.74193: Set connection var ansible_pipelining to False 11683 1726853276.74276: Set connection var ansible_shell_type to sh 11683 1726853276.74279: Set connection var ansible_connection to ssh 11683 1726853276.74281: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.74284: variable 'ansible_connection' from source: unknown 11683 1726853276.74286: variable 'ansible_module_compression' from source: unknown 11683 1726853276.74288: variable 'ansible_shell_type' from source: unknown 11683 1726853276.74290: variable 'ansible_shell_executable' from source: unknown 11683 1726853276.74292: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853276.74294: variable 'ansible_pipelining' from source: unknown 11683 1726853276.74296: variable 'ansible_timeout' from source: unknown 11683 1726853276.74298: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node3' 11683 1726853276.74409: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853276.74429: variable 'omit' from source: magic vars 11683 1726853276.74437: starting attempt loop 11683 1726853276.74443: running the handler 11683 1726853276.74460: _low_level_execute_command(): starting 11683 1726853276.74472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853276.75222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.75237: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.75250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.75287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11683 1726853276.75380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 
1726853276.75425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.75497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.77367: stdout chunk (state=3): >>>/root <<< 11683 1726853276.77576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.77580: stdout chunk (state=3): >>><<< 11683 1726853276.77582: stderr chunk (state=3): >>><<< 11683 1726853276.77586: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853276.77589: _low_level_execute_command(): starting 11683 1726853276.77591: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329 `" && echo ansible-tmp-1726853276.7750373-13029-11180964429329="` echo /root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329 `" ) && sleep 0' 11683 1726853276.78121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853276.78132: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.78145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.78163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853276.78180: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853276.78197: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853276.78210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.78227: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11683 1726853276.78237: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853276.78319: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.78341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.78426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11683 1726853276.80569: stdout chunk (state=3): >>>ansible-tmp-1726853276.7750373-13029-11180964429329=/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329 <<< 11683 1726853276.80638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.80649: stdout chunk (state=3): >>><<< 11683 1726853276.80666: stderr chunk (state=3): >>><<< 11683 1726853276.80978: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853276.7750373-13029-11180964429329=/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853276.80982: variable 'ansible_module_compression' from source: unknown 11683 1726853276.80984: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853276.80987: variable 'ansible_facts' from source: unknown 11683 1726853276.81205: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py 11683 1726853276.81513: Sending initial data 11683 1726853276.81517: Sent initial data (155 bytes) 11683 1726853276.82605: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.82661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853276.82680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.82694: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.82782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.84698: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853276.84703: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853276.84705: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp7m3kb43f /root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py <<< 11683 1726853276.84708: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py" <<< 11683 1726853276.84737: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp7m3kb43f" to remote "/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py" <<< 11683 1726853276.86259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.86337: stderr chunk (state=3): >>><<< 11683 1726853276.86340: stdout chunk (state=3): >>><<< 11683 1726853276.86392: done transferring module to remote 11683 1726853276.86395: _low_level_execute_command(): starting 11683 1726853276.86398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/ /root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py && sleep 0' 11683 1726853276.87160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.87168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853276.87173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.87224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.87285: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853276.89261: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853276.89265: stdout chunk (state=3): >>><<< 11683 1726853276.89275: stderr chunk (state=3): >>><<< 11683 1726853276.89465: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853276.89469: _low_level_execute_command(): starting 11683 1726853276.89474: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/AnsiballZ_command.py && sleep 0' 11683 1726853276.90150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853276.90154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853276.90157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.90160: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address <<< 11683 1726853276.90163: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853276.90166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853276.90262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853276.90266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853276.90337: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.06899: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:27:57.058153", "end": "2024-09-20 13:27:57.066077", "delta": "0:00:00.007924", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853277.08446: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853277.08450: stdout chunk (state=3): >>><<< 11683 1726853277.08453: stderr chunk (state=3): >>><<< 11683 1726853277.08455: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 13:27:57.058153", "end": "2024-09-20 13:27:57.066077", "delta": "0:00:00.007924", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.217 closed. 
11683 1726853277.08585: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853277.08590: _low_level_execute_command(): starting 11683 1726853277.08592: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853276.7750373-13029-11180964429329/ > /dev/null 2>&1 && sleep 0' 11683 1726853277.09300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.09320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.09392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 
originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.09457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.09476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.09695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.09772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.11880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.11884: stdout chunk (state=3): >>><<< 11683 1726853277.11887: stderr chunk (state=3): >>><<< 11683 1726853277.11889: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 11683 1726853277.11892: handler run complete 11683 1726853277.11894: Evaluated conditional (False): False 11683 1726853277.11896: Evaluated conditional (False): False 11683 1726853277.11898: attempt loop complete, returning result 11683 1726853277.11900: _execute() done 11683 1726853277.11902: dumping result to json 11683 1726853277.11904: done dumping result, returning 11683 1726853277.11906: done running TaskExecutor() for managed_node3/TASK: Delete the device 'nm-bond' [02083763-bbaf-c5b2-e075-0000000000c1] 11683 1726853277.11908: sending task result for task 02083763-bbaf-c5b2-e075-0000000000c1 11683 1726853277.11999: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000c1 11683 1726853277.12002: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007924", "end": "2024-09-20 13:27:57.066077", "failed_when_result": false, "rc": 1, "start": "2024-09-20 13:27:57.058153" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11683 1726853277.12089: no more pending results, returning what we have 11683 1726853277.12094: results queue empty 11683 1726853277.12095: checking for any_errors_fatal 11683 1726853277.12097: done checking for any_errors_fatal 11683 1726853277.12098: checking for max_fail_percentage 11683 1726853277.12100: done checking for max_fail_percentage 11683 1726853277.12101: checking to see if all hosts have failed and the running result is not ok 11683 1726853277.12102: done checking to see if all hosts have failed 11683 1726853277.12103: getting the remaining hosts for this loop 11683 1726853277.12105: done getting the remaining hosts for this loop 11683 1726853277.12109: getting the next task for host managed_node3 11683 1726853277.12122: done getting next task for host managed_node3 11683 1726853277.12126: ^ task is: TASK: Remove test interfaces 11683 1726853277.12129: ^ state is: HOST STATE: block=2, task=16, rescue=0, 
always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853277.12135: getting variables 11683 1726853277.12138: in VariableManager get_vars() 11683 1726853277.12240: Calling all_inventory to load vars for managed_node3 11683 1726853277.12246: Calling groups_inventory to load vars for managed_node3 11683 1726853277.12249: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853277.12262: Calling all_plugins_play to load vars for managed_node3 11683 1726853277.12265: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853277.12269: Calling groups_plugins_play to load vars for managed_node3 11683 1726853277.14250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853277.16998: done with get_vars() 11683 1726853277.17029: done getting variables 11683 1726853277.17093: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] 
************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 13:27:57 -0400 (0:00:00.448) 0:00:30.243 ****** 11683 1726853277.17126: entering _queue_task() for managed_node3/shell 11683 1726853277.17533: worker is 1 (out of 1 available) 11683 1726853277.17550: exiting _queue_task() for managed_node3/shell 11683 1726853277.17564: done queuing things up, now waiting for results queue to drain 11683 1726853277.17565: waiting for pending results... 11683 1726853277.17920: running TaskExecutor() for managed_node3/TASK: Remove test interfaces 11683 1726853277.17983: in run() - task 02083763-bbaf-c5b2-e075-0000000000c5 11683 1726853277.18018: variable 'ansible_search_path' from source: unknown 11683 1726853277.18022: variable 'ansible_search_path' from source: unknown 11683 1726853277.18039: calling self._execute() 11683 1726853277.18246: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853277.18250: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853277.18253: variable 'omit' from source: magic vars 11683 1726853277.18781: variable 'ansible_distribution_major_version' from source: facts 11683 1726853277.18785: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853277.18788: variable 'omit' from source: magic vars 11683 1726853277.18790: variable 'omit' from source: magic vars 11683 1726853277.18877: variable 'dhcp_interface1' from source: play vars 11683 1726853277.18882: variable 'dhcp_interface2' from source: play vars 11683 1726853277.18885: variable 'omit' from source: magic vars 11683 1726853277.18888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853277.18908: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 11683 1726853277.18923: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853277.18940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853277.18952: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853277.18985: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853277.18989: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853277.19001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853277.19114: Set connection var ansible_shell_executable to /bin/sh 11683 1726853277.19117: Set connection var ansible_timeout to 10 11683 1726853277.19120: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853277.19146: Set connection var ansible_pipelining to False 11683 1726853277.19150: Set connection var ansible_shell_type to sh 11683 1726853277.19152: Set connection var ansible_connection to ssh 11683 1726853277.19156: variable 'ansible_shell_executable' from source: unknown 11683 1726853277.19158: variable 'ansible_connection' from source: unknown 11683 1726853277.19161: variable 'ansible_module_compression' from source: unknown 11683 1726853277.19163: variable 'ansible_shell_type' from source: unknown 11683 1726853277.19165: variable 'ansible_shell_executable' from source: unknown 11683 1726853277.19167: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853277.19169: variable 'ansible_pipelining' from source: unknown 11683 1726853277.19174: variable 'ansible_timeout' from source: unknown 11683 1726853277.19176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853277.19509: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853277.19512: variable 'omit' from source: magic vars 11683 1726853277.19515: starting attempt loop 11683 1726853277.19517: running the handler 11683 1726853277.19519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853277.19522: _low_level_execute_command(): starting 11683 1726853277.19524: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853277.20088: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.20101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.20113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853277.20192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.20306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.20312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.20314: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.20377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.22110: stdout chunk (state=3): >>>/root <<< 11683 1726853277.22263: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.22278: stdout chunk (state=3): >>><<< 11683 1726853277.22295: stderr chunk (state=3): >>><<< 11683 1726853277.22345: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status 
from master 0 11683 1726853277.22377: _low_level_execute_command(): starting 11683 1726853277.22396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119 `" && echo ansible-tmp-1726853277.2236102-13062-74567939966119="` echo /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119 `" ) && sleep 0' 11683 1726853277.23318: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.23334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.23337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.23339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.23410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.25411: stdout chunk (state=3): 
>>>ansible-tmp-1726853277.2236102-13062-74567939966119=/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119 <<< 11683 1726853277.25580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.25584: stdout chunk (state=3): >>><<< 11683 1726853277.25777: stderr chunk (state=3): >>><<< 11683 1726853277.25781: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853277.2236102-13062-74567939966119=/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853277.25784: variable 'ansible_module_compression' from source: unknown 11683 1726853277.25786: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853277.25789: variable 
'ansible_facts' from source: unknown 11683 1726853277.25857: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py 11683 1726853277.26035: Sending initial data 11683 1726853277.26046: Sent initial data (155 bytes) 11683 1726853277.26773: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.26777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.26806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.26821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.26920: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.28593: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853277.28682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853277.28745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpq1s7dk85 /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py <<< 11683 1726853277.28750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py" <<< 11683 1726853277.28812: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpq1s7dk85" to remote "/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py" <<< 11683 1726853277.29640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.29735: stderr chunk (state=3): >>><<< 11683 1726853277.29739: stdout chunk (state=3): >>><<< 11683 1726853277.29741: done transferring module to remote 11683 1726853277.29743: _low_level_execute_command(): starting 11683 1726853277.29746: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/ 
/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py && sleep 0' 11683 1726853277.30425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.30493: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.30599: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.30630: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.30663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.30725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.32689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.32693: stdout chunk (state=3): >>><<< 11683 1726853277.32701: stderr chunk (state=3): >>><<< 11683 1726853277.32796: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853277.32799: _low_level_execute_command(): starting 11683 1726853277.32802: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/AnsiballZ_command.py && sleep 0' 11683 1726853277.33338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.33356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.33372: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853277.33392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853277.33491: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.33506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.33598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.33903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.54443: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:27:57.494093", "end": "2024-09-20 13:27:57.543078", "delta": "0:00:00.048985", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link 
delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853277.56163: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853277.56177: stdout chunk (state=3): >>><<< 11683 1726853277.56190: stderr chunk (state=3): >>><<< 11683 1726853277.56344: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 13:27:57.494093", "end": "2024-09-20 13:27:57.543078", "delta": "0:00:00.048985", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, 
"executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853277.56348: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853277.56355: _low_level_execute_command(): starting 11683 1726853277.56358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853277.2236102-13062-74567939966119/ > /dev/null 2>&1 && sleep 0' 11683 1726853277.57338: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.57341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.57344: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.57347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853277.57349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.57876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.57897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.59810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.59842: stderr chunk (state=3): >>><<< 11683 1726853277.59846: stdout chunk (state=3): >>><<< 11683 1726853277.59865: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853277.59874: handler run complete 11683 1726853277.59895: Evaluated conditional (False): False 11683 1726853277.59903: attempt loop complete, returning result 11683 1726853277.59906: _execute() done 11683 1726853277.59908: dumping result to json 11683 1726853277.59929: done dumping result, returning 11683 1726853277.59932: done running TaskExecutor() for managed_node3/TASK: Remove test interfaces [02083763-bbaf-c5b2-e075-0000000000c5] 11683 1726853277.59934: sending task result for task 02083763-bbaf-c5b2-e075-0000000000c5 11683 1726853277.60097: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000c5 11683 1726853277.60100: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.048985", "end": "2024-09-20 13:27:57.543078", "rc": 0, "start": "2024-09-20 13:27:57.494093" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11683 1726853277.60163: no more pending results, returning what we have 11683 1726853277.60166: results queue empty 11683 1726853277.60167: checking for 
any_errors_fatal 11683 1726853277.60181: done checking for any_errors_fatal 11683 1726853277.60182: checking for max_fail_percentage 11683 1726853277.60183: done checking for max_fail_percentage 11683 1726853277.60184: checking to see if all hosts have failed and the running result is not ok 11683 1726853277.60185: done checking to see if all hosts have failed 11683 1726853277.60186: getting the remaining hosts for this loop 11683 1726853277.60187: done getting the remaining hosts for this loop 11683 1726853277.60190: getting the next task for host managed_node3 11683 1726853277.60196: done getting next task for host managed_node3 11683 1726853277.60199: ^ task is: TASK: Stop dnsmasq/radvd services 11683 1726853277.60202: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853277.60205: getting variables 11683 1726853277.60207: in VariableManager get_vars() 11683 1726853277.60244: Calling all_inventory to load vars for managed_node3 11683 1726853277.60246: Calling groups_inventory to load vars for managed_node3 11683 1726853277.60248: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853277.60258: Calling all_plugins_play to load vars for managed_node3 11683 1726853277.60261: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853277.60263: Calling groups_plugins_play to load vars for managed_node3 11683 1726853277.62225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853277.64129: done with get_vars() 11683 1726853277.64164: done getting variables 11683 1726853277.64230: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 13:27:57 -0400 (0:00:00.471) 0:00:30.714 ****** 11683 1726853277.64273: entering _queue_task() for managed_node3/shell 11683 1726853277.64641: worker is 1 (out of 1 available) 11683 1726853277.64657: exiting _queue_task() for managed_node3/shell 11683 1726853277.64669: done queuing things up, now waiting for results queue to drain 11683 1726853277.64872: waiting for pending results... 
11683 1726853277.65104: running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services 11683 1726853277.65186: in run() - task 02083763-bbaf-c5b2-e075-0000000000c6 11683 1726853277.65200: variable 'ansible_search_path' from source: unknown 11683 1726853277.65204: variable 'ansible_search_path' from source: unknown 11683 1726853277.65260: calling self._execute() 11683 1726853277.65677: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853277.65683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853277.65686: variable 'omit' from source: magic vars 11683 1726853277.65777: variable 'ansible_distribution_major_version' from source: facts 11683 1726853277.65804: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853277.65807: variable 'omit' from source: magic vars 11683 1726853277.65858: variable 'omit' from source: magic vars 11683 1726853277.66002: variable 'omit' from source: magic vars 11683 1726853277.66042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853277.66115: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853277.66135: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853277.66155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853277.66167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853277.66348: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853277.66352: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853277.66354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 
1726853277.66357: Set connection var ansible_shell_executable to /bin/sh 11683 1726853277.66359: Set connection var ansible_timeout to 10 11683 1726853277.66361: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853277.66363: Set connection var ansible_pipelining to False 11683 1726853277.66366: Set connection var ansible_shell_type to sh 11683 1726853277.66368: Set connection var ansible_connection to ssh 11683 1726853277.66370: variable 'ansible_shell_executable' from source: unknown 11683 1726853277.66375: variable 'ansible_connection' from source: unknown 11683 1726853277.66378: variable 'ansible_module_compression' from source: unknown 11683 1726853277.66380: variable 'ansible_shell_type' from source: unknown 11683 1726853277.66382: variable 'ansible_shell_executable' from source: unknown 11683 1726853277.66384: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853277.66458: variable 'ansible_pipelining' from source: unknown 11683 1726853277.66461: variable 'ansible_timeout' from source: unknown 11683 1726853277.66464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853277.66566: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853277.66573: variable 'omit' from source: magic vars 11683 1726853277.66576: starting attempt loop 11683 1726853277.66579: running the handler 11683 1726853277.66582: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853277.66599: 
_low_level_execute_command(): starting 11683 1726853277.66607: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853277.67441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.67446: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.67524: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.69248: stdout chunk (state=3): >>>/root <<< 11683 1726853277.69389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.69405: stdout chunk (state=3): >>><<< 11683 1726853277.69440: stderr chunk (state=3): >>><<< 11683 1726853277.69464: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853277.69508: _low_level_execute_command(): starting 11683 1726853277.69512: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861 `" && echo ansible-tmp-1726853277.6947386-13089-222981912966861="` echo /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861 `" ) && sleep 0' 11683 1726853277.70164: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.70180: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.70195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853277.70212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853277.70242: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 
10.31.11.217 <<< 11683 1726853277.70344: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.70380: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.70483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.72465: stdout chunk (state=3): >>>ansible-tmp-1726853277.6947386-13089-222981912966861=/root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861 <<< 11683 1726853277.72633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.72636: stdout chunk (state=3): >>><<< 11683 1726853277.72639: stderr chunk (state=3): >>><<< 11683 1726853277.72654: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853277.6947386-13089-222981912966861=/root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853277.72692: variable 'ansible_module_compression' from source: unknown 11683 1726853277.72787: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853277.72800: variable 'ansible_facts' from source: unknown 11683 1726853277.72899: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py 11683 1726853277.73120: Sending initial data 11683 1726853277.73129: Sent initial data (156 bytes) 11683 1726853277.73625: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.73664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.73685: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853277.73761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.73785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.73887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.75575: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853277.75618: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853277.75691: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpmcnlu52x /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py <<< 11683 1726853277.75702: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py" <<< 11683 1726853277.75758: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpmcnlu52x" to remote "/root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py" <<< 11683 1726853277.76678: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.76716: stderr chunk (state=3): >>><<< 11683 1726853277.76726: stdout chunk (state=3): >>><<< 11683 1726853277.76769: done transferring module to remote 11683 1726853277.76850: _low_level_execute_command(): starting 11683 1726853277.76854: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/ /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py && sleep 0' 11683 1726853277.77481: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853277.77618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853277.77622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.77649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.77748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.79645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853277.79655: stdout chunk (state=3): >>><<< 11683 1726853277.79665: stderr chunk (state=3): >>><<< 11683 1726853277.79686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853277.79694: _low_level_execute_command(): starting 11683 1726853277.79703: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/AnsiballZ_command.py && sleep 0' 11683 1726853277.80439: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853277.80446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.80450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853277.80452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853277.80455: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853277.80522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853277.80525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853277.80529: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853277.80605: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853277.99064: stdout chunk (state=3): >>> <<< 11683 1726853277.99082: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:27:57.962138", "end": "2024-09-20 13:27:57.989128", "delta": "0:00:00.026990", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853278.01164: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853278.01168: stdout chunk (state=3): >>><<< 11683 1726853278.01172: stderr chunk (state=3): >>><<< 11683 1726853278.01175: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 13:27:57.962138", "end": "2024-09-20 13:27:57.989128", "delta": "0:00:00.026990", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853278.01185: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853278.01187: _low_level_execute_command(): starting 11683 1726853278.01189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853277.6947386-13089-222981912966861/ > /dev/null 2>&1 && sleep 0' 11683 1726853278.02177: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.02260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.02287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.02340: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.02430: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.04429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.04445: stdout chunk (state=3): >>><<< 11683 1726853278.04498: stderr chunk (state=3): >>><<< 11683 1726853278.04633: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.04654: handler run complete 11683 1726853278.04808: Evaluated conditional (False): False 11683 1726853278.04826: attempt loop complete, returning result 11683 1726853278.04893: _execute() done 11683 1726853278.04901: dumping result to json 11683 1726853278.04914: done dumping result, returning 11683 1726853278.04936: done running TaskExecutor() for managed_node3/TASK: Stop dnsmasq/radvd services [02083763-bbaf-c5b2-e075-0000000000c6] 11683 1726853278.04950: sending task result for task 02083763-bbaf-c5b2-e075-0000000000c6 11683 1726853278.05323: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000c6 11683 1726853278.05326: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026990", "end": "2024-09-20 13:27:57.989128", "rc": 0, "start": "2024-09-20 13:27:57.962138" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11683 1726853278.05402: no more pending results, returning 
what we have 11683 1726853278.05407: results queue empty 11683 1726853278.05408: checking for any_errors_fatal 11683 1726853278.05417: done checking for any_errors_fatal 11683 1726853278.05418: checking for max_fail_percentage 11683 1726853278.05422: done checking for max_fail_percentage 11683 1726853278.05422: checking to see if all hosts have failed and the running result is not ok 11683 1726853278.05424: done checking to see if all hosts have failed 11683 1726853278.05424: getting the remaining hosts for this loop 11683 1726853278.05426: done getting the remaining hosts for this loop 11683 1726853278.05430: getting the next task for host managed_node3 11683 1726853278.05439: done getting next task for host managed_node3 11683 1726853278.05441: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11683 1726853278.05447: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853278.05453: getting variables 11683 1726853278.05455: in VariableManager get_vars() 11683 1726853278.05502: Calling all_inventory to load vars for managed_node3 11683 1726853278.05505: Calling groups_inventory to load vars for managed_node3 11683 1726853278.05507: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853278.05520: Calling all_plugins_play to load vars for managed_node3 11683 1726853278.05523: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853278.05526: Calling groups_plugins_play to load vars for managed_node3 11683 1726853278.07526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853278.09930: done with get_vars() 11683 1726853278.09970: done getting variables 11683 1726853278.10038: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Friday 20 September 2024 13:27:58 -0400 (0:00:00.458) 0:00:31.173 ****** 11683 1726853278.10178: entering _queue_task() for managed_node3/command 11683 1726853278.10616: worker is 1 (out of 1 available) 11683 1726853278.10630: exiting _queue_task() for managed_node3/command 11683 1726853278.10646: done queuing things up, now waiting for results queue to drain 11683 1726853278.10648: waiting for pending results... 
11683 1726853278.11273: running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript 11683 1726853278.11476: in run() - task 02083763-bbaf-c5b2-e075-0000000000c7 11683 1726853278.12030: variable 'ansible_search_path' from source: unknown 11683 1726853278.12125: calling self._execute() 11683 1726853278.12370: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.12401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.12412: variable 'omit' from source: magic vars 11683 1726853278.13279: variable 'ansible_distribution_major_version' from source: facts 11683 1726853278.13292: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853278.13397: variable 'network_provider' from source: set_fact 11683 1726853278.13401: Evaluated conditional (network_provider == "initscripts"): False 11683 1726853278.13404: when evaluation is False, skipping this task 11683 1726853278.13407: _execute() done 11683 1726853278.13433: dumping result to json 11683 1726853278.13436: done dumping result, returning 11683 1726853278.13439: done running TaskExecutor() for managed_node3/TASK: Restore the /etc/resolv.conf for initscript [02083763-bbaf-c5b2-e075-0000000000c7] 11683 1726853278.13442: sending task result for task 02083763-bbaf-c5b2-e075-0000000000c7 11683 1726853278.13524: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000c7 11683 1726853278.13527: WORKER PROCESS EXITING skipping: [managed_node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11683 1726853278.13589: no more pending results, returning what we have 11683 1726853278.13594: results queue empty 11683 1726853278.13595: checking for any_errors_fatal 11683 1726853278.13606: done checking for any_errors_fatal 11683 1726853278.13607: checking for max_fail_percentage 11683 1726853278.13609: done 
checking for max_fail_percentage 11683 1726853278.13609: checking to see if all hosts have failed and the running result is not ok 11683 1726853278.13610: done checking to see if all hosts have failed 11683 1726853278.13611: getting the remaining hosts for this loop 11683 1726853278.13613: done getting the remaining hosts for this loop 11683 1726853278.13615: getting the next task for host managed_node3 11683 1726853278.13622: done getting next task for host managed_node3 11683 1726853278.13625: ^ task is: TASK: Verify network state restored to default 11683 1726853278.13628: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853278.13632: getting variables 11683 1726853278.13634: in VariableManager get_vars() 11683 1726853278.13680: Calling all_inventory to load vars for managed_node3 11683 1726853278.13683: Calling groups_inventory to load vars for managed_node3 11683 1726853278.13685: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853278.13698: Calling all_plugins_play to load vars for managed_node3 11683 1726853278.13700: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853278.13702: Calling groups_plugins_play to load vars for managed_node3 11683 1726853278.17388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853278.19455: done with get_vars() 11683 1726853278.19490: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Friday 20 September 2024 13:27:58 -0400 (0:00:00.094) 0:00:31.267 ****** 11683 1726853278.19587: entering _queue_task() for managed_node3/include_tasks 11683 1726853278.19942: worker is 1 (out of 1 available) 11683 1726853278.19957: exiting _queue_task() for managed_node3/include_tasks 11683 1726853278.19970: done queuing things up, now waiting for results queue to drain 11683 1726853278.19976: waiting for pending results... 
11683 1726853278.20332: running TaskExecutor() for managed_node3/TASK: Verify network state restored to default 11683 1726853278.20430: in run() - task 02083763-bbaf-c5b2-e075-0000000000c8 11683 1726853278.20435: variable 'ansible_search_path' from source: unknown 11683 1726853278.20461: calling self._execute() 11683 1726853278.20573: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.20587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.20606: variable 'omit' from source: magic vars 11683 1726853278.21078: variable 'ansible_distribution_major_version' from source: facts 11683 1726853278.21081: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853278.21083: _execute() done 11683 1726853278.21085: dumping result to json 11683 1726853278.21087: done dumping result, returning 11683 1726853278.21089: done running TaskExecutor() for managed_node3/TASK: Verify network state restored to default [02083763-bbaf-c5b2-e075-0000000000c8] 11683 1726853278.21091: sending task result for task 02083763-bbaf-c5b2-e075-0000000000c8 11683 1726853278.21165: done sending task result for task 02083763-bbaf-c5b2-e075-0000000000c8 11683 1726853278.21168: WORKER PROCESS EXITING 11683 1726853278.21198: no more pending results, returning what we have 11683 1726853278.21203: in VariableManager get_vars() 11683 1726853278.21250: Calling all_inventory to load vars for managed_node3 11683 1726853278.21252: Calling groups_inventory to load vars for managed_node3 11683 1726853278.21254: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853278.21267: Calling all_plugins_play to load vars for managed_node3 11683 1726853278.21270: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853278.21378: Calling groups_plugins_play to load vars for managed_node3 11683 1726853278.22922: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853278.24557: done with get_vars() 11683 1726853278.24586: variable 'ansible_search_path' from source: unknown 11683 1726853278.24603: we have included files to process 11683 1726853278.24604: generating all_blocks data 11683 1726853278.24606: done generating all_blocks data 11683 1726853278.24612: processing included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11683 1726853278.24613: loading included file: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11683 1726853278.24616: Loading data from /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11683 1726853278.25063: done processing included file 11683 1726853278.25065: iterating over new_blocks loaded from include file 11683 1726853278.25067: in VariableManager get_vars() 11683 1726853278.25088: done with get_vars() 11683 1726853278.25090: filtering new block on tags 11683 1726853278.25132: done filtering new block on tags 11683 1726853278.25135: done iterating over new_blocks loaded from include file included: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node3 11683 1726853278.25140: extending task lists for all hosts with included blocks 11683 1726853278.26414: done extending task lists 11683 1726853278.26416: done processing included files 11683 1726853278.26417: results queue empty 11683 1726853278.26417: checking for any_errors_fatal 11683 1726853278.26420: done checking for any_errors_fatal 11683 1726853278.26421: checking for max_fail_percentage 11683 1726853278.26422: done checking for max_fail_percentage 11683 1726853278.26422: checking to see if all hosts have failed and the running 
result is not ok 11683 1726853278.26423: done checking to see if all hosts have failed 11683 1726853278.26424: getting the remaining hosts for this loop 11683 1726853278.26425: done getting the remaining hosts for this loop 11683 1726853278.26431: getting the next task for host managed_node3 11683 1726853278.26435: done getting next task for host managed_node3 11683 1726853278.26437: ^ task is: TASK: Check routes and DNS 11683 1726853278.26440: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11683 1726853278.26444: getting variables 11683 1726853278.26445: in VariableManager get_vars() 11683 1726853278.26459: Calling all_inventory to load vars for managed_node3 11683 1726853278.26461: Calling groups_inventory to load vars for managed_node3 11683 1726853278.26463: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853278.26468: Calling all_plugins_play to load vars for managed_node3 11683 1726853278.26470: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853278.26475: Calling groups_plugins_play to load vars for managed_node3 11683 1726853278.27759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853278.29333: done with get_vars() 11683 1726853278.29377: done getting variables 11683 1726853278.29428: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 13:27:58 -0400 (0:00:00.098) 0:00:31.366 ****** 11683 1726853278.29470: entering _queue_task() for managed_node3/shell 11683 1726853278.29861: worker is 1 (out of 1 available) 11683 1726853278.29878: exiting _queue_task() for managed_node3/shell 11683 1726853278.29891: done queuing things up, now waiting for results queue to drain 11683 1726853278.29893: waiting for pending results... 
11683 1726853278.30249: running TaskExecutor() for managed_node3/TASK: Check routes and DNS 11683 1726853278.30284: in run() - task 02083763-bbaf-c5b2-e075-00000000056d 11683 1726853278.30303: variable 'ansible_search_path' from source: unknown 11683 1726853278.30307: variable 'ansible_search_path' from source: unknown 11683 1726853278.30344: calling self._execute() 11683 1726853278.30453: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.30457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.30465: variable 'omit' from source: magic vars 11683 1726853278.30855: variable 'ansible_distribution_major_version' from source: facts 11683 1726853278.30867: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853278.30874: variable 'omit' from source: magic vars 11683 1726853278.30922: variable 'omit' from source: magic vars 11683 1726853278.30964: variable 'omit' from source: magic vars 11683 1726853278.31004: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853278.31176: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853278.31180: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853278.31182: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853278.31184: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853278.31187: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853278.31189: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.31191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.31248: 
Set connection var ansible_shell_executable to /bin/sh 11683 1726853278.31257: Set connection var ansible_timeout to 10 11683 1726853278.31265: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853278.31270: Set connection var ansible_pipelining to False 11683 1726853278.31275: Set connection var ansible_shell_type to sh 11683 1726853278.31282: Set connection var ansible_connection to ssh 11683 1726853278.31308: variable 'ansible_shell_executable' from source: unknown 11683 1726853278.31311: variable 'ansible_connection' from source: unknown 11683 1726853278.31314: variable 'ansible_module_compression' from source: unknown 11683 1726853278.31317: variable 'ansible_shell_type' from source: unknown 11683 1726853278.31319: variable 'ansible_shell_executable' from source: unknown 11683 1726853278.31321: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.31323: variable 'ansible_pipelining' from source: unknown 11683 1726853278.31325: variable 'ansible_timeout' from source: unknown 11683 1726853278.31328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.31476: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853278.31486: variable 'omit' from source: magic vars 11683 1726853278.31496: starting attempt loop 11683 1726853278.31499: running the handler 11683 1726853278.31510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853278.31528: 
_low_level_execute_command(): starting 11683 1726853278.31537: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853278.32369: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.32420: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.32481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.34350: stdout chunk (state=3): >>>/root <<< 11683 1726853278.34438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.34474: stdout chunk (state=3): >>><<< 11683 1726853278.34477: stderr chunk (state=3): >>><<< 11683 1726853278.34498: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.34598: _low_level_execute_command(): starting 11683 1726853278.34602: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073 `" && echo ansible-tmp-1726853278.3450682-13130-34679718718073="` echo /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073 `" ) && sleep 0' 11683 1726853278.35613: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.35616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.35627: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration <<< 11683 1726853278.35629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found <<< 11683 1726853278.35632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.35688: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.35706: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.35778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.37796: stdout chunk (state=3): >>>ansible-tmp-1726853278.3450682-13130-34679718718073=/root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073 <<< 11683 1726853278.38578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.38582: stdout chunk (state=3): >>><<< 11683 1726853278.38585: stderr chunk (state=3): >>><<< 11683 1726853278.38590: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853278.3450682-13130-34679718718073=/root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.38593: variable 'ansible_module_compression' from source: unknown 11683 1726853278.38596: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853278.38599: variable 'ansible_facts' from source: unknown 11683 1726853278.38744: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py 11683 1726853278.39412: Sending initial data 11683 1726853278.39416: Sent initial data (155 bytes) 11683 1726853278.40054: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853278.40092: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.40106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.40122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853278.40136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853278.40145: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853278.40186: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.40251: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.40290: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.40322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.40380: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.42123: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853278.42421: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11683 1726853278.42464: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp7u8v39_1 /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py <<< 11683 1726853278.42467: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py" <<< 11683 1726853278.42540: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmp7u8v39_1" to remote "/root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py" <<< 11683 1726853278.43741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.43822: stderr chunk (state=3): >>><<< 11683 1726853278.43832: stdout chunk (state=3): >>><<< 11683 1726853278.43902: done transferring module to remote 11683 1726853278.43923: _low_level_execute_command(): starting 11683 1726853278.43934: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/ /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py && sleep 0' 11683 1726853278.44569: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853278.44586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.44668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.44712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.44727: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.44748: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.44844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.46788: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.46792: stdout chunk (state=3): >>><<< 11683 1726853278.46795: stderr chunk (state=3): >>><<< 11683 1726853278.46856: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.46947: _low_level_execute_command(): starting 11683 1726853278.46951: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/AnsiballZ_command.py && sleep 0' 11683 1726853278.47483: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853278.47497: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.47510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.47524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853278.47538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853278.47587: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.47645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.47688: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.47748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.64359: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3202sec preferred_lft 3202sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:27:58.632969", "end": "2024-09-20 13:27:58.641858", "delta": "0:00:00.008889", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO 
/etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853278.66046: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. <<< 11683 1726853278.66050: stdout chunk (state=3): >>><<< 11683 1726853278.66053: stderr chunk (state=3): >>><<< 11683 1726853278.66177: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.217/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3202sec preferred_lft 3202sec\n inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 13:27:58.632969", "end": "2024-09-20 13:27:58.641858", "delta": "0:00:00.008889", 
"msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 
11683 1726853278.66187: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853278.66190: _low_level_execute_command(): starting 11683 1726853278.66193: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853278.3450682-13130-34679718718073/ > /dev/null 2>&1 && sleep 0' 11683 1726853278.66791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853278.66799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.66809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.66822: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853278.66832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853278.66874: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.66931: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.66952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.66961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.67049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.69063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.69067: stdout chunk (state=3): >>><<< 11683 1726853278.69069: stderr chunk (state=3): >>><<< 11683 1726853278.69091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.69097: handler run complete 11683 1726853278.69121: Evaluated conditional (False): False 11683 1726853278.69131: attempt loop complete, returning result 11683 1726853278.69134: _execute() done 11683 1726853278.69136: dumping result to json 11683 1726853278.69145: done dumping result, returning 11683 1726853278.69151: done running TaskExecutor() for managed_node3/TASK: Check routes and DNS [02083763-bbaf-c5b2-e075-00000000056d] 11683 1726853278.69156: sending task result for task 02083763-bbaf-c5b2-e075-00000000056d 11683 1726853278.69268: done sending task result for task 02083763-bbaf-c5b2-e075-00000000056d 11683 1726853278.69378: WORKER PROCESS EXITING ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008889", "end": "2024-09-20 13:27:58.641858", "rc": 0, "start": "2024-09-20 13:27:58.632969" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:2a:53:36:f0:e9 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.217/22 brd 10.31.11.255 
scope global dynamic noprefixroute eth0 valid_lft 3202sec preferred_lft 3202sec inet6 fe80::102a:53ff:fe36:f0e9/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.217 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.217 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11683 1726853278.69451: no more pending results, returning what we have 11683 1726853278.69454: results queue empty 11683 1726853278.69455: checking for any_errors_fatal 11683 1726853278.69457: done checking for any_errors_fatal 11683 1726853278.69457: checking for max_fail_percentage 11683 1726853278.69459: done checking for max_fail_percentage 11683 1726853278.69459: checking to see if all hosts have failed and the running result is not ok 11683 1726853278.69460: done checking to see if all hosts have failed 11683 1726853278.69461: getting the remaining hosts for this loop 11683 1726853278.69462: done getting the remaining hosts for this loop 11683 1726853278.69465: getting the next task for host managed_node3 11683 1726853278.69473: done getting next task for host managed_node3 11683 1726853278.69475: ^ task is: TASK: Verify DNS and network connectivity 11683 1726853278.69478: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11683 1726853278.69487: getting variables 11683 1726853278.69488: in VariableManager get_vars() 11683 1726853278.69522: Calling all_inventory to load vars for managed_node3 11683 1726853278.69524: Calling groups_inventory to load vars for managed_node3 11683 1726853278.69526: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853278.69535: Calling all_plugins_play to load vars for managed_node3 11683 1726853278.69538: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853278.69540: Calling groups_plugins_play to load vars for managed_node3 11683 1726853278.70951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853278.72859: done with get_vars() 11683 1726853278.72889: done getting variables 11683 1726853278.72957: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 13:27:58 -0400 (0:00:00.435) 0:00:31.801 ****** 11683 1726853278.72991: entering _queue_task() for managed_node3/shell 11683 1726853278.73351: worker is 1 (out of 1 available) 11683 
1726853278.73477: exiting _queue_task() for managed_node3/shell 11683 1726853278.73487: done queuing things up, now waiting for results queue to drain 11683 1726853278.73489: waiting for pending results... 11683 1726853278.73728: running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity 11683 1726853278.73910: in run() - task 02083763-bbaf-c5b2-e075-00000000056e 11683 1726853278.73914: variable 'ansible_search_path' from source: unknown 11683 1726853278.73929: variable 'ansible_search_path' from source: unknown 11683 1726853278.74017: calling self._execute() 11683 1726853278.74116: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.74132: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.74154: variable 'omit' from source: magic vars 11683 1726853278.74552: variable 'ansible_distribution_major_version' from source: facts 11683 1726853278.74575: Evaluated conditional (ansible_distribution_major_version != '6'): True 11683 1726853278.74780: variable 'ansible_facts' from source: unknown 11683 1726853278.75481: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11683 1726853278.75493: variable 'omit' from source: magic vars 11683 1726853278.75565: variable 'omit' from source: magic vars 11683 1726853278.75609: variable 'omit' from source: magic vars 11683 1726853278.75665: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11683 1726853278.75708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11683 1726853278.75735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11683 1726853278.75768: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853278.75789: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11683 1726853278.75826: variable 'inventory_hostname' from source: host vars for 'managed_node3' 11683 1726853278.75835: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.75845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.75963: Set connection var ansible_shell_executable to /bin/sh 11683 1726853278.75990: Set connection var ansible_timeout to 10 11683 1726853278.76004: Set connection var ansible_module_compression to ZIP_DEFLATED 11683 1726853278.76103: Set connection var ansible_pipelining to False 11683 1726853278.76106: Set connection var ansible_shell_type to sh 11683 1726853278.76109: Set connection var ansible_connection to ssh 11683 1726853278.76112: variable 'ansible_shell_executable' from source: unknown 11683 1726853278.76113: variable 'ansible_connection' from source: unknown 11683 1726853278.76155: variable 'ansible_module_compression' from source: unknown 11683 1726853278.76164: variable 'ansible_shell_type' from source: unknown 11683 1726853278.76488: variable 'ansible_shell_executable' from source: unknown 11683 1726853278.76491: variable 'ansible_host' from source: host vars for 'managed_node3' 11683 1726853278.76493: variable 'ansible_pipelining' from source: unknown 11683 1726853278.76495: variable 'ansible_timeout' from source: unknown 11683 1726853278.76498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node3' 11683 1726853278.76501: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853278.76503: variable 'omit' from source: magic vars 11683 1726853278.76593: starting attempt 
loop 11683 1726853278.76601: running the handler 11683 1726853278.76616: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11683 1726853278.76658: _low_level_execute_command(): starting 11683 1726853278.76709: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11683 1726853278.78214: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.78235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11683 1726853278.78367: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.78413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.78487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.78500: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.78634: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11683 1726853278.80392: stdout chunk (state=3): >>>/root <<< 11683 1726853278.80450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.80488: stderr chunk (state=3): >>><<< 11683 1726853278.80496: stdout chunk (state=3): >>><<< 11683 1726853278.80528: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.80548: _low_level_execute_command(): starting 11683 1726853278.80558: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522 `" && echo ansible-tmp-1726853278.8053524-13155-85243723724522="` echo 
/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522 `" ) && sleep 0' 11683 1726853278.81446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853278.81474: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.81601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.81619: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.81635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.81850: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.81941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.83976: stdout chunk (state=3): >>>ansible-tmp-1726853278.8053524-13155-85243723724522=/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522 <<< 11683 1726853278.84127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.84138: stdout chunk (state=3): >>><<< 11683 1726853278.84160: stderr chunk (state=3): >>><<< 11683 
1726853278.84190: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726853278.8053524-13155-85243723724522=/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.84235: variable 'ansible_module_compression' from source: unknown 11683 1726853278.84314: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-11683xn3gfh52/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11683 1726853278.84373: variable 'ansible_facts' from source: unknown 11683 1726853278.84465: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py 11683 1726853278.84701: Sending initial data 11683 1726853278.84704: Sent initial data (155 bytes) 11683 1726853278.85697: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.85744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.85836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.87503: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11683 1726853278.87528: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11683 1726853278.87622: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11683 1726853278.87685: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpdhvgybq3 /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py <<< 11683 1726853278.87700: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py" <<< 11683 1726853278.87735: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-11683xn3gfh52/tmpdhvgybq3" to remote "/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py" <<< 11683 1726853278.88984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.88989: stdout chunk (state=3): >>><<< 11683 1726853278.88991: stderr chunk (state=3): >>><<< 11683 1726853278.88994: done transferring module to remote 11683 1726853278.88996: _low_level_execute_command(): starting 11683 1726853278.88998: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/ /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py && sleep 0' 11683 1726853278.89965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853278.90004: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.90021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.90040: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11683 1726853278.90059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 <<< 11683 1726853278.90074: stderr chunk (state=3): >>>debug2: match not found <<< 11683 1726853278.90111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.90196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853278.90392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.90477: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853278.92481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853278.92485: stdout chunk (state=3): >>><<< 11683 1726853278.92488: stderr chunk (state=3): >>><<< 11683 1726853278.92491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853278.92493: _low_level_execute_command(): starting 11683 1726853278.92495: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/AnsiballZ_command.py && sleep 0' 11683 1726853278.93437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.93442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found <<< 11683 1726853278.93501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.93507: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853278.93596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11683 1726853278.93599: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853278.93602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853278.93711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853278.93839: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853279.24693: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd 
Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5739 0 --:--:-- --:--:-- --:--:-- 5754\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3829 0 --:--:-- --:--:-- --:--:-- 3880", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:27:59.092290", "end": "2024-09-20 13:27:59.245547", "delta": "0:00:00.153257", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11683 1726853279.26465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853279.26529: stderr chunk (state=3): >>>Shared connection to 10.31.11.217 closed. 
<<< 11683 1726853279.26533: stdout chunk (state=3): >>><<< 11683 1726853279.26675: stderr chunk (state=3): >>><<< 11683 1726853279.26681: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 5739 0 --:--:-- --:--:-- --:--:-- 5754\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 3829 0 --:--:-- --:--:-- --:--:-- 3880", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 13:27:59.092290", "end": "2024-09-20 13:27:59.245547", "delta": "0:00:00.153257", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.217 closed. 11683 1726853279.26690: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11683 1726853279.26692: _low_level_execute_command(): starting 11683 1726853279.26694: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726853278.8053524-13155-85243723724522/ > /dev/null 2>&1 && sleep 0' 11683 1726853279.27592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11683 1726853279.27610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11683 1726853279.27688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11683 1726853279.27724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' <<< 11683 1726853279.27748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11683 1726853279.27767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11683 1726853279.27932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11683 1726853279.29927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11683 1726853279.30106: stderr chunk (state=3): >>><<< 11683 1726853279.30116: stdout chunk (state=3): >>><<< 11683 1726853279.30138: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.217 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.217 originally 10.31.11.217 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bee039678b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11683 1726853279.30150: handler run complete 11683 1726853279.30178: Evaluated conditional (False): False 11683 1726853279.30192: attempt loop complete, returning result 11683 1726853279.30199: _execute() done 11683 1726853279.30205: dumping result to json 11683 1726853279.30215: done dumping result, returning 11683 1726853279.30228: done running TaskExecutor() for managed_node3/TASK: Verify DNS and network connectivity [02083763-bbaf-c5b2-e075-00000000056e] 11683 1726853279.30237: sending task result for task 02083763-bbaf-c5b2-e075-00000000056e ok: [managed_node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.153257", "end": "2024-09-20 13:27:59.245547", "rc": 0, "start": "2024-09-20 13:27:59.092290" } STDOUT: CHECK DNS AND CONNECTIVITY 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 5739 0 --:--:-- --:--:-- --:--:-- 5754 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 3829 0 --:--:-- --:--:-- --:--:-- 3880 11683 1726853279.30583: no more pending results, returning what we have 11683 1726853279.30586: results queue empty 11683 1726853279.30587: 
checking for any_errors_fatal 11683 1726853279.30596: done checking for any_errors_fatal 11683 1726853279.30596: checking for max_fail_percentage 11683 1726853279.30598: done checking for max_fail_percentage 11683 1726853279.30599: checking to see if all hosts have failed and the running result is not ok 11683 1726853279.30600: done checking to see if all hosts have failed 11683 1726853279.30600: getting the remaining hosts for this loop 11683 1726853279.30602: done getting the remaining hosts for this loop 11683 1726853279.30604: getting the next task for host managed_node3 11683 1726853279.30613: done getting next task for host managed_node3 11683 1726853279.30614: ^ task is: TASK: meta (flush_handlers) 11683 1726853279.30622: done sending task result for task 02083763-bbaf-c5b2-e075-00000000056e 11683 1726853279.30661: WORKER PROCESS EXITING 11683 1726853279.30655: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853279.30679: getting variables 11683 1726853279.30681: in VariableManager get_vars() 11683 1726853279.30716: Calling all_inventory to load vars for managed_node3 11683 1726853279.30719: Calling groups_inventory to load vars for managed_node3 11683 1726853279.30721: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853279.30805: Calling all_plugins_play to load vars for managed_node3 11683 1726853279.30809: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853279.30813: Calling groups_plugins_play to load vars for managed_node3 11683 1726853279.32837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853279.34501: done with get_vars() 11683 1726853279.34529: done getting variables 11683 1726853279.34612: in VariableManager get_vars() 11683 1726853279.34629: Calling all_inventory to load vars for managed_node3 11683 1726853279.34631: Calling groups_inventory to load vars for managed_node3 11683 1726853279.34633: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853279.34638: Calling all_plugins_play to load vars for managed_node3 11683 1726853279.34641: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853279.34647: Calling groups_plugins_play to load vars for managed_node3 11683 1726853279.36048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853279.37695: done with get_vars() 11683 1726853279.37728: done queuing things up, now waiting for results queue to drain 11683 1726853279.37730: results queue empty 11683 1726853279.37731: checking for any_errors_fatal 11683 1726853279.37735: done checking for any_errors_fatal 11683 1726853279.37736: checking for max_fail_percentage 11683 1726853279.37745: done checking for max_fail_percentage 11683 1726853279.37746: checking to see if all hosts have failed and the running result is not 
ok 11683 1726853279.37747: done checking to see if all hosts have failed 11683 1726853279.37747: getting the remaining hosts for this loop 11683 1726853279.37748: done getting the remaining hosts for this loop 11683 1726853279.37752: getting the next task for host managed_node3 11683 1726853279.37755: done getting next task for host managed_node3 11683 1726853279.37757: ^ task is: TASK: meta (flush_handlers) 11683 1726853279.37758: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11683 1726853279.37761: getting variables 11683 1726853279.37762: in VariableManager get_vars() 11683 1726853279.37779: Calling all_inventory to load vars for managed_node3 11683 1726853279.37782: Calling groups_inventory to load vars for managed_node3 11683 1726853279.37784: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853279.37789: Calling all_plugins_play to load vars for managed_node3 11683 1726853279.37792: Calling groups_plugins_inventory to load vars for managed_node3 11683 1726853279.37794: Calling groups_plugins_play to load vars for managed_node3 11683 1726853279.39084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853279.40609: done with get_vars() 11683 1726853279.40632: done getting variables 11683 1726853279.40693: in VariableManager get_vars() 11683 1726853279.40708: Calling all_inventory to load vars for managed_node3 11683 1726853279.40710: Calling groups_inventory to load vars for managed_node3 11683 1726853279.40712: Calling all_plugins_inventory to load vars for managed_node3 11683 1726853279.40717: Calling all_plugins_play to load vars for managed_node3 11683 1726853279.40720: Calling groups_plugins_inventory to load vars for 
managed_node3 11683 1726853279.40722: Calling groups_plugins_play to load vars for managed_node3 11683 1726853279.41884: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11683 1726853279.43636: done with get_vars() 11683 1726853279.43665: done queuing things up, now waiting for results queue to drain 11683 1726853279.43667: results queue empty 11683 1726853279.43668: checking for any_errors_fatal 11683 1726853279.43669: done checking for any_errors_fatal 11683 1726853279.43670: checking for max_fail_percentage 11683 1726853279.43673: done checking for max_fail_percentage 11683 1726853279.43674: checking to see if all hosts have failed and the running result is not ok 11683 1726853279.43674: done checking to see if all hosts have failed 11683 1726853279.43675: getting the remaining hosts for this loop 11683 1726853279.43677: done getting the remaining hosts for this loop 11683 1726853279.43680: getting the next task for host managed_node3 11683 1726853279.43684: done getting next task for host managed_node3 11683 1726853279.43685: ^ task is: None 11683 1726853279.43686: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11683 1726853279.43687: done queuing things up, now waiting for results queue to drain 11683 1726853279.43688: results queue empty 11683 1726853279.43689: checking for any_errors_fatal 11683 1726853279.43690: done checking for any_errors_fatal 11683 1726853279.43691: checking for max_fail_percentage 11683 1726853279.43692: done checking for max_fail_percentage 11683 1726853279.43692: checking to see if all hosts have failed and the running result is not ok 11683 1726853279.43693: done checking to see if all hosts have failed 11683 1726853279.43695: getting the next task for host managed_node3 11683 1726853279.43697: done getting next task for host managed_node3 11683 1726853279.43698: ^ task is: None 11683 1726853279.43699: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
PLAY RECAP *********************************************************************
managed_node3              : ok=76   changed=2    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Friday 20 September 2024  13:27:59 -0400 (0:00:00.707)       0:00:32.509 ******
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 1.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.90s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.82s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.30s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
Install dnsmasq --------------------------------------------------------- 1.05s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.97s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.93s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.89s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Check which packages are installed --- 0.86s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.85s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather the minimum subset of ansible_facts required by the network role test --- 0.81s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Install pgrep, sysctl --------------------------------------------------- 0.78s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Verify DNS and network connectivity ------------------------------------- 0.71s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.67s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.58s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get NM profile info ----------------------------------------------------- 0.49s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Get NM profile info ----------------------------------------------------- 0.49s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Remove test interfaces -------------------------------------------------- 0.47s
/tmp/collections-Qi7/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
11683 1726853279.43814: RUNNING CLEANUP
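The "Verify DNS and network connectivity" task in this run ships its check to the remote host as the `_raw_params` of an `ansible.legacy.command` payload. Below is a standalone sketch of that check, reconstructed from the script text visible in the log; wrapping it in a function and taking hosts as arguments instead of the hard-coded mirror list are my changes for reuse, not part of the original task.

```shell
#!/bin/sh
# Sketch of the connectivity check embedded in the AnsiballZ_command payload
# above. The function form and the host arguments are additions; the original
# task hard-codes mirrors.fedoraproject.org and mirrors.centos.org.
check_hosts() {
    echo "CHECK DNS AND CONNECTIVITY"
    for host in "$@"; do
        # getent exercises the full NSS resolver stack, as the original does
        if ! getent hosts "$host"; then
            echo "FAILED to lookup host $host"
            return 1
        fi
        # curl exits nonzero on connection failure; the response body is discarded
        if ! curl -o /dev/null "https://$host"; then
            echo "FAILED to contact host $host"
            return 1
        fi
    done
}

# Original (network-dependent) invocation:
# check_hosts mirrors.fedoraproject.org mirrors.centos.org
```

With no arguments the loop body never runs, so the function only prints its header and returns 0, which makes the lookup/contact branches easy to dry-run before pointing it at real mirrors.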